Mar 20 16:00:43 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 16:00:43 crc restorecon[4692]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 16:00:43 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 16:00:44 crc restorecon[4692]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc 
restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc 
restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:00:44 crc 
restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc 
restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc 
restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:44 crc restorecon[4692]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 
crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc 
restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:44 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc 
restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:00:45 crc restorecon[4692]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 16:00:45 crc restorecon[4692]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 16:00:45 crc kubenswrapper[4708]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 16:00:45 crc kubenswrapper[4708]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 16:00:45 crc kubenswrapper[4708]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 16:00:45 crc kubenswrapper[4708]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 16:00:45 crc kubenswrapper[4708]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 16:00:45 crc kubenswrapper[4708]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.865782 4708 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870365 4708 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870419 4708 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870423 4708 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870428 4708 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870432 4708 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870436 4708 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870442 4708 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870448 4708 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870451 4708 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870456 4708 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870460 4708 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870464 4708 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870468 4708 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870473 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870477 4708 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870499 4708 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870504 4708 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870508 4708 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870513 4708 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870517 4708 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870521 4708 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870526 4708 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870531 4708 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870536 4708 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870540 4708 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870544 4708 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870549 4708 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870558 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870562 4708 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870566 4708 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870570 4708 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870573 4708 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870577 4708 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870581 4708 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870585 4708 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870588 4708 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870592 4708 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870595 4708 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870599 4708 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870602 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870606 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870609 4708 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870613 4708 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870617 4708 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870622 4708 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870627 4708 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870631 4708 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870634 4708 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870638 4708 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870641 4708 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870645 4708 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870648 4708 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870652 4708 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870657 4708 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870662 4708 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870680 4708 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870685 4708 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870689 4708 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870693 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870697 4708 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870701 4708 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870705 4708 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870708 4708 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870712 4708 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870716 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870719 4708 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870723 4708 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870727 4708 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870731 4708 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870735 4708 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.870739 4708 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871501 4708 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871517 4708 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871531 4708 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871537 4708 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871545 4708 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871549 4708 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871555 4708 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871562 4708 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871566 4708 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871571 4708 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871575 4708 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871580 4708 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871584 4708 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871592 4708 flags.go:64] FLAG: --cgroup-root=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871597 4708 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871601 4708 flags.go:64] FLAG: --client-ca-file=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871605 4708 flags.go:64] FLAG: --cloud-config=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871609 4708 flags.go:64] FLAG: --cloud-provider=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871614 4708 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871619 4708 flags.go:64] FLAG: --cluster-domain=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871624 4708 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871628 4708 flags.go:64] FLAG: --config-dir=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871632 4708 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871637 4708 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871643 4708 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871648 4708 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871652 4708 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871657 4708 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871661 4708 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871681 4708 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871685 4708 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871690 4708 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871714 4708 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871720 4708 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871725 4708 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871729 4708 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871733 4708 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871737 4708 flags.go:64] FLAG: --enable-server="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871743 4708 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871750 4708 flags.go:64] FLAG: --event-burst="100"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871754 4708 flags.go:64] FLAG: --event-qps="50"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871759 4708 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871763 4708 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871768 4708 flags.go:64] FLAG: --eviction-hard=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871779 4708 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871786 4708 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871791 4708 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871795 4708 flags.go:64] FLAG: --eviction-soft=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871799 4708 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871804 4708 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871809 4708 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871814 4708 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871818 4708 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871822 4708 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871827 4708 flags.go:64] FLAG: --feature-gates=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871833 4708 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871838 4708 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871842 4708 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871847 4708 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871852 4708 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871856 4708 flags.go:64] FLAG: --help="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871861 4708 flags.go:64] FLAG: --hostname-override=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871865 4708 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871870 4708 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871874 4708 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871879 4708 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871883 4708 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871887 4708 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871891 4708 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871896 4708 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871900 4708 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871904 4708 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871909 4708 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871913 4708 flags.go:64] FLAG: --kube-reserved=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871917 4708 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871922 4708 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871926 4708 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871931 4708 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871936 4708 flags.go:64] FLAG: --lock-file=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871940 4708 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871945 4708 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871950 4708 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871957 4708 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871961 4708 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871965 4708 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871969 4708 flags.go:64] FLAG: --logging-format="text"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871973 4708 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871978 4708 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871982 4708 flags.go:64] FLAG: --manifest-url=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871986 4708 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871992 4708 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.871997 4708 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872002 4708 flags.go:64] FLAG: --max-pods="110"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872007 4708 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872011 4708 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872015 4708 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872019 4708 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872024 4708 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872028 4708 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872033 4708 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872240 4708 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872252 4708 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872263 4708 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872272 4708 flags.go:64] FLAG: --pod-cidr=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872281 4708 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872299 4708 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872307 4708 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872315 4708 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872322 4708 flags.go:64] FLAG: --port="10250"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872330 4708 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872340 4708 flags.go:64] FLAG: --provider-id=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872346 4708 flags.go:64] FLAG: --qos-reserved=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872354 4708 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872361 4708 flags.go:64] FLAG: --register-node="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872368 4708 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872379 4708 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872393 4708 flags.go:64] FLAG: --registry-burst="10"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872400 4708 flags.go:64] FLAG: --registry-qps="5"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872406 4708 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872412 4708 flags.go:64] FLAG: --reserved-memory=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872421 4708 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872427 4708 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872434 4708 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872440 4708 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872447 4708 flags.go:64] FLAG: --runonce="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872453 4708 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872460 4708 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872466 4708 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872472 4708 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872479 4708 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872488 4708 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872495 4708 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872502 4708 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872508 4708 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872515 4708 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872522 4708 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872528 4708 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872534 4708 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872541 4708 flags.go:64] FLAG: --system-cgroups=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872547 4708 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872558 4708 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872564 4708 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872573 4708 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872585 4708 flags.go:64] FLAG: --tls-min-version=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872591 4708 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872597 4708 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872606 4708 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872613 4708 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872619 4708 flags.go:64] FLAG: --v="2"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872630 4708 flags.go:64] FLAG: --version="false"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872639 4708 flags.go:64] FLAG: --vmodule=""
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872649 4708 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.872656 4708 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.872987 4708 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.872996 4708 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873003 4708 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873009 4708 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873015 4708 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873020 4708 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873025 4708 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873031 4708 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873043 4708 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873048 4708 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873053 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873063 4708 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873069 4708 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873074 4708 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873079 4708 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873085 4708 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873090 4708 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873095 4708 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873100 4708 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873105 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873110 4708 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873118 4708 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873126 4708 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873131 4708 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873136 4708 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873144 4708 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873151 4708 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873156 4708 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873162 4708 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873167 4708 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873172 4708 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873177 4708 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873183 4708 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873188 4708 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873197 4708 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873202 4708 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873207 4708 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873212 4708 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873217 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873222 4708 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873230 4708 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873235 4708 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873240 4708 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873245 4708 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873250 4708 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873255 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873260 4708 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873265 4708 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873272 4708 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873279 4708 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873286 4708 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873296 4708 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873303 4708 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873308 4708 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873314 4708 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873319 4708 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873324 4708 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873329 4708 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873334 4708 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873340 4708 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873344 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873349 4708 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873354 4708 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873359 4708 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873364 4708 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873369 4708 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873374 4708 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873379 4708 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873384 4708 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873389 4708 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.873396 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.873406 4708 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.885698 4708 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.885763 4708 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885861 4708 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885874 4708 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885880 4708 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885886 4708 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885891 4708 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885896 4708 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885900 4708 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885906 4708 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885910 4708 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885915 4708 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885920 4708 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885926 4708 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885934 4708 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885940 4708 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885946 4708 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885951 4708 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885956 4708 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885960 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885966 4708 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885970 4708 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885975 4708 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885979 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885983 4708 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885987 4708 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.885992 4708 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886047 4708 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 
16:00:45.886055 4708 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886060 4708 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886065 4708 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886069 4708 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886076 4708 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886081 4708 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886086 4708 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886091 4708 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886098 4708 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886103 4708 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886107 4708 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886113 4708 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886118 4708 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886123 4708 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 
16:00:45.886128 4708 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886133 4708 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886138 4708 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886143 4708 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886148 4708 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886152 4708 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886157 4708 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886162 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886167 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886171 4708 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886176 4708 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886180 4708 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886186 4708 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886197 4708 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886203 4708 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886208 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886214 4708 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886220 4708 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886225 4708 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886231 4708 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886236 4708 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886241 4708 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886246 4708 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886251 4708 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886256 4708 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886264 4708 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886269 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 16:00:45 crc 
kubenswrapper[4708]: W0320 16:00:45.886274 4708 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886279 4708 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886284 4708 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886292 4708 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.886302 4708 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886462 4708 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886475 4708 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886480 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886486 4708 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886490 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886497 4708 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886503 
4708 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886510 4708 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886515 4708 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886522 4708 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886527 4708 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886532 4708 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886537 4708 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886542 4708 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886547 4708 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886552 4708 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886557 4708 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886562 4708 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886567 4708 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886572 4708 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886577 4708 
feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886582 4708 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886587 4708 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886592 4708 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886598 4708 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886603 4708 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886608 4708 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886613 4708 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886618 4708 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886623 4708 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886628 4708 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886633 4708 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886638 4708 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886643 4708 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886649 4708 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 16:00:45 crc 
kubenswrapper[4708]: W0320 16:00:45.886654 4708 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886661 4708 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886691 4708 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886699 4708 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886704 4708 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886709 4708 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886715 4708 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886719 4708 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886724 4708 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886729 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886734 4708 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886739 4708 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886745 4708 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886752 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886758 4708 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886764 4708 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886769 4708 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886774 4708 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886779 4708 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886784 4708 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886790 4708 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886795 4708 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886799 4708 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886804 4708 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886809 4708 feature_gate.go:330] unrecognized feature gate: Example Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886813 4708 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886818 4708 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886823 4708 
feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886827 4708 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886832 4708 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886836 4708 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886841 4708 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886845 4708 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886849 4708 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886853 4708 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 16:00:45 crc kubenswrapper[4708]: W0320 16:00:45.886859 4708 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.886868 4708 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.888023 4708 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 16:00:45 crc kubenswrapper[4708]: E0320 16:00:45.892997 4708 bootstrap.go:266] "Unhandled Error" err="part of the 
existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.896629 4708 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.896782 4708 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.898537 4708 server.go:997] "Starting client certificate rotation" Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.898606 4708 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.899727 4708 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.947479 4708 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 16:00:45 crc kubenswrapper[4708]: E0320 16:00:45.950319 4708 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.952983 4708 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 16:00:45 crc kubenswrapper[4708]: I0320 16:00:45.968427 4708 log.go:25] "Validated CRI v1 runtime API" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.000968 4708 log.go:25] "Validated CRI v1 image API" Mar 
20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.003098 4708 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.008578 4708 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-15-55-39-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.008610 4708 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.022716 4708 manager.go:217] Machine: {Timestamp:2026-03-20 16:00:46.020224404 +0000 UTC m=+0.694561139 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:445dca2f-6b37-4b9f-94a5-2336a8fbca00 BootID:aab1a40b-9efc-47fe-9821-27ec8e6c1980 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 
Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:f2:22:6f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:f2:22:6f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:4d:51:4e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a2:53:84 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6a:82:4c Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:46:ed:87 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:66:0f:c4:2c:c8:46 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9e:81:13:eb:7f:a2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 
Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] 
SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.023008 4708 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.023164 4708 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.024375 4708 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.024555 4708 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.024594 4708 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.024846 4708 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.024856 4708 container_manager_linux.go:303] "Creating device plugin manager" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.025486 4708 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.025518 4708 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.025972 4708 state_mem.go:36] "Initialized new in-memory state store" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.026080 4708 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.030060 4708 kubelet.go:418] "Attempting to sync node with API server" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.030085 4708 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.030114 4708 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.030130 4708 kubelet.go:324] "Adding apiserver pod source" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.030146 4708 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.036517 4708 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 20 16:00:46 crc kubenswrapper[4708]: W0320 16:00:46.037431 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.037514 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:00:46 crc kubenswrapper[4708]: W0320 16:00:46.037544 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.037701 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.037953 4708 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.040939 4708 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044594 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044620 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044627 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044635 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044648 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044656 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044678 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044691 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044701 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044711 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044722 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.044728 4708 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.045648 4708 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.046196 4708 server.go:1280] "Started kubelet" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.046552 4708 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.046642 4708 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 16:00:46 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.052634 4708 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.053090 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.055296 4708 server.go:460] "Adding debug handlers to kubelet server" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.055371 4708 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.055432 4708 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.056779 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.060725 4708 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.060793 4708 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.060854 4708 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 16:00:46 crc 
kubenswrapper[4708]: W0320 16:00:46.061854 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.062031 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" interval="200ms" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.061972 4708 factory.go:55] Registering systemd factory Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.062236 4708 factory.go:221] Registration of the systemd container factory successfully Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.062108 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.061191 4708 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.4:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e9803f826d3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,LastTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC 
m=+0.720508826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.062658 4708 factory.go:153] Registering CRI-O factory Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.063080 4708 factory.go:221] Registration of the crio container factory successfully Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.063279 4708 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.063407 4708 factory.go:103] Registering Raw factory Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.063493 4708 manager.go:1196] Started watching for new ooms in manager Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.064454 4708 manager.go:319] Starting recovery of all containers Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070474 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070530 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070561 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070572 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070581 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070592 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070601 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070611 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070624 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070633 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070642 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070652 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070661 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070736 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070747 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070758 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070766 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070775 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070784 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070796 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070807 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070818 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070828 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070839 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070849 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070861 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070875 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070910 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070920 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070928 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070937 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070947 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070957 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" 
seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070986 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.070995 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071004 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071013 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071021 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071031 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 
16:00:46.071039 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071050 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071060 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071069 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071079 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071089 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071100 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071111 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071120 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071129 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071140 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071150 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071159 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071173 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071201 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071211 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071221 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071232 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071240 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071250 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071261 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071272 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071281 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071290 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071300 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071310 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071319 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071328 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071338 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071347 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071356 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071366 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071375 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071385 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071394 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071403 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071412 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071423 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071432 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071443 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071454 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071463 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071473 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071482 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071492 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071503 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071513 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071524 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071556 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071566 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071575 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071585 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071598 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071607 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071616 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071625 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071636 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071646 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071656 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071859 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071955 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071974 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.071995 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072013 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072034 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072072 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072090 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072108 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072125 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072143 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072156 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072175 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072188 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072198 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072209 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072221 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072232 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072244 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072255 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072268 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072279 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072291 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072301 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072312 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072324 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072337 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072347 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072360 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072372 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072383 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072393 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072405 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072419 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072431 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072441 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072452 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072462 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072475 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072485 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072496 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072507 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072517 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072528 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072540 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072550 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072562 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072574 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072592 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072603 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072616 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072625 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072634 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072645 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072656 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072733 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072744 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072757 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072767 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072777 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072807 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072818 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072828 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072839 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072894 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.072906 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078021 4708 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078116 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078164 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078182 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078205 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078244 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078279 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078299 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078356 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078381 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078420 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078439 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078457 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078495 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078516 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078540 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078582 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078600 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078618 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078653 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078733 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078757 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078800 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078823 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078839 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078876 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078895 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078912 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078930 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.078986 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079008 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079050 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079082 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079102 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079141 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079160 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079179 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079217 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079234 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079255 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079295 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" 
seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079313 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079329 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079347 4708 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079388 4708 reconstruct.go:97] "Volume reconstruction finished" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.079401 4708 reconciler.go:26] "Reconciler: start to sync state" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.088271 4708 manager.go:324] Recovery completed Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.106937 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.107651 4708 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.109151 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.109214 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.109234 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.109616 4708 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.109705 4708 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.109741 4708 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.110210 4708 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.110491 4708 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.110517 4708 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.110543 4708 state_mem.go:36] "Initialized new in-memory state store" Mar 20 16:00:46 crc kubenswrapper[4708]: W0320 16:00:46.110717 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.110815 4708 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.136411 4708 policy_none.go:49] "None policy: Start" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.138023 4708 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.138075 4708 state_mem.go:35] "Initializing new in-memory state store" Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.157097 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.211369 4708 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.213184 4708 manager.go:334] "Starting Device Plugin manager" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.213286 4708 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.213301 4708 server.go:79] "Starting device plugin registration server" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.213985 4708 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.214063 4708 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.214384 4708 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.214546 4708 
plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.214558 4708 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.222016 4708 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.296721 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" interval="400ms" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.314871 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.316820 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.316891 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.316918 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.316965 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.317543 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.4:6443: connect: connection refused" node="crc" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.412299 4708 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.412906 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.415154 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.415220 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.415236 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.415431 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.416096 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.416148 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.416624 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.416657 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.416689 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.416827 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.416911 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.416930 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.417274 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.417325 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.417345 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.417647 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.417679 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.417701 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.417710 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.417685 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.417766 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.417862 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc 
kubenswrapper[4708]: I0320 16:00:46.417996 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.418040 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.418880 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.418926 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.418946 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.419103 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.419129 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.419148 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.419159 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.419242 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.419266 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.420139 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.420167 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.420178 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.420324 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.420349 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.420859 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.420938 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.421018 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.420945 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.421144 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 
16:00:46.421157 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.498484 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.498837 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.498993 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499179 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499312 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" 
(UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499348 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499379 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499403 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499422 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499444 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499476 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499566 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499727 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499791 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.499832 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.518014 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.519130 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.519159 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.519167 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.519192 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.519558 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.4:6443: connect: connection refused" node="crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.601022 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.601663 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.601735 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.601998 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602051 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602094 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602128 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602160 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602192 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602218 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602250 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602247 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602327 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602325 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602304 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602376 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602279 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602413 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602423 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602442 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602464 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602510 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602551 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602558 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602596 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602624 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602662 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602701 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.602745 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.604317 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.697845 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" interval="800ms"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.772576 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.794450 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.802819 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.823557 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: W0320 16:00:46.824561 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b35cc8332eae483a5f63a6376928a62b3f3f94c5be5798205de8231c4c970154 WatchSource:0}: Error finding container b35cc8332eae483a5f63a6376928a62b3f3f94c5be5798205de8231c4c970154: Status 404 returned error can't find the container with id b35cc8332eae483a5f63a6376928a62b3f3f94c5be5798205de8231c4c970154
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.829406 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: W0320 16:00:46.843176 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5a2d1233f2409315738d3fcfa489fd8b709d0e7da98d8564a980ef74100a0a84 WatchSource:0}: Error finding container 5a2d1233f2409315738d3fcfa489fd8b709d0e7da98d8564a980ef74100a0a84: Status 404 returned error can't find the container with id 5a2d1233f2409315738d3fcfa489fd8b709d0e7da98d8564a980ef74100a0a84
Mar 20 16:00:46 crc kubenswrapper[4708]: W0320 16:00:46.845027 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-d9e8f2f9ec6e0566b55ac2338298868d89c77174c366a79ad67bf4cfd977f6ef WatchSource:0}: Error finding container d9e8f2f9ec6e0566b55ac2338298868d89c77174c366a79ad67bf4cfd977f6ef: Status 404 returned error can't find the container with id d9e8f2f9ec6e0566b55ac2338298868d89c77174c366a79ad67bf4cfd977f6ef
Mar 20 16:00:46 crc kubenswrapper[4708]: W0320 16:00:46.853090 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-1d240f7d8380e0332003320b7add6b0e6b3717e14733e2aa0e7dfa9b8d5e54fe WatchSource:0}: Error finding container 1d240f7d8380e0332003320b7add6b0e6b3717e14733e2aa0e7dfa9b8d5e54fe: Status 404 returned error can't find the container with id 1d240f7d8380e0332003320b7add6b0e6b3717e14733e2aa0e7dfa9b8d5e54fe
Mar 20 16:00:46 crc kubenswrapper[4708]: W0320 16:00:46.854482 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-6304f149ff6a9523b7dc5269b5bfaac92b7e96c013cb4182bbd2f7fdc0b79dfb WatchSource:0}: Error finding container 6304f149ff6a9523b7dc5269b5bfaac92b7e96c013cb4182bbd2f7fdc0b79dfb: Status 404 returned error can't find the container with id 6304f149ff6a9523b7dc5269b5bfaac92b7e96c013cb4182bbd2f7fdc0b79dfb
Mar 20 16:00:46 crc kubenswrapper[4708]: W0320 16:00:46.856005 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused
Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.856094 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.920541 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.922544 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.922625 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.922643 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:00:46 crc kubenswrapper[4708]: I0320 16:00:46.922708 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.923357 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.4:6443: connect: connection refused" node="crc"
Mar 20 16:00:46 crc kubenswrapper[4708]: W0320 16:00:46.989831 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused
Mar 20 16:00:46 crc kubenswrapper[4708]: E0320 16:00:46.989945 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError"
Mar 20 16:00:47 crc kubenswrapper[4708]: I0320 16:00:47.054520 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused
Mar 20 16:00:47 crc kubenswrapper[4708]: I0320 16:00:47.116171 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6304f149ff6a9523b7dc5269b5bfaac92b7e96c013cb4182bbd2f7fdc0b79dfb"}
Mar 20 16:00:47 crc kubenswrapper[4708]: I0320 16:00:47.117916 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1d240f7d8380e0332003320b7add6b0e6b3717e14733e2aa0e7dfa9b8d5e54fe"}
Mar 20 16:00:47 crc kubenswrapper[4708]: I0320 16:00:47.119201 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d9e8f2f9ec6e0566b55ac2338298868d89c77174c366a79ad67bf4cfd977f6ef"}
Mar 20 16:00:47 crc kubenswrapper[4708]: I0320 16:00:47.120869 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5a2d1233f2409315738d3fcfa489fd8b709d0e7da98d8564a980ef74100a0a84"}
Mar 20 16:00:47 crc kubenswrapper[4708]: I0320 16:00:47.121990 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b35cc8332eae483a5f63a6376928a62b3f3f94c5be5798205de8231c4c970154"}
Mar 20 16:00:47 crc kubenswrapper[4708]: E0320 16:00:47.499986 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" interval="1.6s"
Mar 20 16:00:47 crc kubenswrapper[4708]: W0320 16:00:47.594297 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused
Mar 20 16:00:47 crc kubenswrapper[4708]: E0320 16:00:47.594995 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError"
Mar 20 16:00:47 crc kubenswrapper[4708]: W0320 16:00:47.599720 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused
Mar 20 16:00:47 crc kubenswrapper[4708]: E0320 16:00:47.599765 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError"
Mar 20 16:00:47 crc kubenswrapper[4708]: I0320 16:00:47.723556 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:00:47 crc kubenswrapper[4708]: I0320 16:00:47.725650 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:00:47 crc kubenswrapper[4708]: I0320 16:00:47.725714 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:00:47 crc kubenswrapper[4708]: I0320 16:00:47.725727 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:00:47 crc kubenswrapper[4708]: I0320 16:00:47.725760 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 16:00:47 crc kubenswrapper[4708]: E0320 16:00:47.726176 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.4:6443: connect: connection refused" node="crc"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.054517 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.081635 4708 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 16:00:48 crc kubenswrapper[4708]: E0320 16:00:48.086082 4708 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.127334 4708 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96" exitCode=0
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.127422 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96"}
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.127577 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.129746 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.129789 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.129804 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.130880 4708 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b" exitCode=0
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.130944 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b"}
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.131018 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.132411 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.132444 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.132458 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.134134 4708 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3" exitCode=0
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.134202 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3"}
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.134310 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.136539 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.136599 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.136624 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.140323 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e"}
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.140378 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8"}
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.140405 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6"}
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.143483 4708 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5" exitCode=0
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.143530 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5"}
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.143740 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.145103 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.145144 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.145163 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.149739 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.151612 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.151748 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:00:48 crc kubenswrapper[4708]: I0320 16:00:48.151771 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:00:48 crc kubenswrapper[4708]: E0320 16:00:48.302469 4708 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.4:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e9803f826d3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,LastTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 16:00:48 crc kubenswrapper[4708]: W0320 16:00:48.988003 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused
Mar 20 16:00:48 crc kubenswrapper[4708]: E0320 16:00:48.988126 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError"
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.054075 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused
Mar 20 16:00:49 crc kubenswrapper[4708]: E0320 16:00:49.101407 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" interval="3.2s"
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.150712 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cf231961d33f62280cb7698c00ad3326849806ddc1d5bf010cb11f6cf5a27c01"}
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.150792 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.150797 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"67a83c494b01d6dcf1c0ff7f564cf91d62c6e0f19fc7d4301dc4cd3f5da38b58"}
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.150985 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bce00a794483a44bc8052d04362f63fd12b2abe8577d8f276aadb4609c40709f"}
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.152527 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.152615 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.152638 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.154935 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a"}
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.154987 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.155843 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.155878 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.155887 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.161549 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789"}
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.161578 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa"}
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.161590 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c"}
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.161600 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171"}
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.166362 4708 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493" exitCode=0
Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.166506 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc"
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493"} Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.166793 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.182246 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.182629 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.182691 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.184824 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8"} Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.184988 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.186483 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.186520 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.186537 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.326935 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.329114 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.329166 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.329183 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.329222 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:00:49 crc kubenswrapper[4708]: E0320 16:00:49.329729 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.4:6443: connect: connection refused" node="crc" Mar 20 16:00:49 crc kubenswrapper[4708]: W0320 16:00:49.361533 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused Mar 20 16:00:49 crc kubenswrapper[4708]: E0320 16:00:49.361624 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.552513 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:00:49 crc kubenswrapper[4708]: I0320 16:00:49.793190 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 16:00:49 crc kubenswrapper[4708]: W0320 16:00:49.951717 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused Mar 20 16:00:49 crc kubenswrapper[4708]: E0320 16:00:49.951847 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.4:6443: connect: connection refused" logger="UnhandledError" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.054445 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.4:6443: connect: connection refused Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.192283 4708 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb" exitCode=0 Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.192409 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb"} Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.192528 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.194021 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:50 crc 
kubenswrapper[4708]: I0320 16:00:50.194098 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.194119 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.194370 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.196138 4708 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="105d7eee768e1f9072dcac1ce1661080037ad575c8e0421708366f9f274a2077" exitCode=255 Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.196226 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.196285 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.196306 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.196352 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.196392 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"105d7eee768e1f9072dcac1ce1661080037ad575c8e0421708366f9f274a2077"} Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.197568 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:50 crc 
kubenswrapper[4708]: I0320 16:00:50.197616 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.197629 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.197725 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.197757 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.197768 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.197829 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.197867 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.197888 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.198443 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.198471 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.198484 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:50 crc kubenswrapper[4708]: I0320 16:00:50.198535 4708 scope.go:117] "RemoveContainer" 
containerID="105d7eee768e1f9072dcac1ce1661080037ad575c8e0421708366f9f274a2077" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.201959 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2"} Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.202020 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6"} Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.202037 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46"} Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.203315 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.204613 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.205125 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3cb502f5b3a8483599197c6a24a3959ff5dd3a0eaf398f92e39d8ec84728c989"} Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.205308 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.205334 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.205355 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.205375 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.205491 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.205505 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.206547 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.206571 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.206580 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.207299 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.207329 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.207339 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.469199 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:00:51 crc 
kubenswrapper[4708]: I0320 16:00:51.480546 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:00:51 crc kubenswrapper[4708]: I0320 16:00:51.584963 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.216100 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf"} Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.216179 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb"} Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.216218 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.216249 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.216228 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.216395 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.218041 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.218079 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.218091 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.218162 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.218204 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.218227 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.218447 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.218485 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.218507 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.257523 4708 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.340196 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.530710 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.532741 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.533041 4708 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.533193 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:52 crc kubenswrapper[4708]: I0320 16:00:52.533368 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.221166 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.221832 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.221876 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.225173 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.225349 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.225431 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.225173 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.225604 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.225629 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.225224 4708 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.225736 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.225756 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:53 crc kubenswrapper[4708]: I0320 16:00:53.926239 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:00:54 crc kubenswrapper[4708]: I0320 16:00:54.224264 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:54 crc kubenswrapper[4708]: I0320 16:00:54.225636 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:54 crc kubenswrapper[4708]: I0320 16:00:54.225731 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:54 crc kubenswrapper[4708]: I0320 16:00:54.225759 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.158085 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.158350 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.160428 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.160499 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.160533 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.226794 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.228336 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.228422 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.228450 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.503082 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.503320 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.504973 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.505018 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:55 crc kubenswrapper[4708]: I0320 16:00:55.505028 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:56 crc kubenswrapper[4708]: E0320 16:00:56.222160 4708 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 16:00:57 crc kubenswrapper[4708]: I0320 16:00:57.293040 4708 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 16:00:57 crc kubenswrapper[4708]: I0320 16:00:57.294042 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:00:57 crc kubenswrapper[4708]: I0320 16:00:57.295780 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:00:57 crc kubenswrapper[4708]: I0320 16:00:57.295843 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:00:57 crc kubenswrapper[4708]: I0320 16:00:57.295862 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:00:58 crc kubenswrapper[4708]: I0320 16:00:58.159055 4708 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:00:58 crc kubenswrapper[4708]: I0320 16:00:58.159183 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 16:01:00 crc kubenswrapper[4708]: E0320 16:01:00.521856 4708 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z" 
event="&Event{ObjectMeta:{crc.189e9803f826d3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,LastTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:00 crc kubenswrapper[4708]: I0320 16:01:00.522786 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z Mar 20 16:01:00 crc kubenswrapper[4708]: W0320 16:01:00.524364 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z Mar 20 16:01:00 crc kubenswrapper[4708]: E0320 16:01:00.524463 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:00 crc kubenswrapper[4708]: W0320 16:01:00.525399 4708 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z Mar 20 16:01:00 crc kubenswrapper[4708]: E0320 16:01:00.525479 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:00 crc kubenswrapper[4708]: W0320 16:01:00.526960 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z Mar 20 16:01:00 crc kubenswrapper[4708]: E0320 16:01:00.527040 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:00 crc kubenswrapper[4708]: W0320 16:01:00.529932 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z Mar 20 16:01:00 crc kubenswrapper[4708]: E0320 16:01:00.530051 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:00 crc kubenswrapper[4708]: E0320 16:01:00.535039 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 16:01:00 crc kubenswrapper[4708]: E0320 16:01:00.535505 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 16:01:00 crc kubenswrapper[4708]: E0320 16:01:00.536938 4708 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T16:01:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:00 crc kubenswrapper[4708]: I0320 16:01:00.537308 4708 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 16:01:00 crc kubenswrapper[4708]: I0320 16:01:00.537406 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 16:01:00 crc kubenswrapper[4708]: I0320 16:01:00.543785 4708 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 16:01:00 crc kubenswrapper[4708]: I0320 16:01:00.543873 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.058537 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T16:01:01Z is after 2026-02-23T05:33:13Z Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.244828 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.245254 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.246747 4708 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3cb502f5b3a8483599197c6a24a3959ff5dd3a0eaf398f92e39d8ec84728c989" exitCode=255 Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.246800 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3cb502f5b3a8483599197c6a24a3959ff5dd3a0eaf398f92e39d8ec84728c989"} Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.246868 4708 scope.go:117] "RemoveContainer" containerID="105d7eee768e1f9072dcac1ce1661080037ad575c8e0421708366f9f274a2077" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.246997 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.247815 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.247859 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.247872 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:01 crc 
kubenswrapper[4708]: I0320 16:01:01.248501 4708 scope.go:117] "RemoveContainer" containerID="3cb502f5b3a8483599197c6a24a3959ff5dd3a0eaf398f92e39d8ec84728c989" Mar 20 16:01:01 crc kubenswrapper[4708]: E0320 16:01:01.248690 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.600771 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.600926 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.602186 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.602224 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:01 crc kubenswrapper[4708]: I0320 16:01:01.602236 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:02 crc kubenswrapper[4708]: I0320 16:01:02.056551 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:02Z is after 2026-02-23T05:33:13Z Mar 20 16:01:02 crc kubenswrapper[4708]: I0320 16:01:02.251944 4708 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 16:01:02 crc kubenswrapper[4708]: I0320 16:01:02.347846 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:01:02 crc kubenswrapper[4708]: I0320 16:01:02.348183 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:02 crc kubenswrapper[4708]: I0320 16:01:02.350076 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:02 crc kubenswrapper[4708]: I0320 16:01:02.350126 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:02 crc kubenswrapper[4708]: I0320 16:01:02.350142 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:02 crc kubenswrapper[4708]: I0320 16:01:02.350793 4708 scope.go:117] "RemoveContainer" containerID="3cb502f5b3a8483599197c6a24a3959ff5dd3a0eaf398f92e39d8ec84728c989" Mar 20 16:01:02 crc kubenswrapper[4708]: E0320 16:01:02.351206 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:02 crc kubenswrapper[4708]: I0320 16:01:02.356967 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:01:03 crc kubenswrapper[4708]: I0320 16:01:03.057372 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:03Z is after 2026-02-23T05:33:13Z Mar 20 16:01:03 crc kubenswrapper[4708]: I0320 16:01:03.257615 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:03 crc kubenswrapper[4708]: I0320 16:01:03.258744 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:03 crc kubenswrapper[4708]: I0320 16:01:03.258842 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:03 crc kubenswrapper[4708]: I0320 16:01:03.258874 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:03 crc kubenswrapper[4708]: I0320 16:01:03.259884 4708 scope.go:117] "RemoveContainer" containerID="3cb502f5b3a8483599197c6a24a3959ff5dd3a0eaf398f92e39d8ec84728c989" Mar 20 16:01:03 crc kubenswrapper[4708]: E0320 16:01:03.260216 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:04 crc kubenswrapper[4708]: I0320 16:01:04.059304 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:04Z is after 2026-02-23T05:33:13Z Mar 20 16:01:04 crc 
kubenswrapper[4708]: W0320 16:01:04.251534 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:04Z is after 2026-02-23T05:33:13Z Mar 20 16:01:04 crc kubenswrapper[4708]: E0320 16:01:04.251742 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:05 crc kubenswrapper[4708]: I0320 16:01:05.056979 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:05Z is after 2026-02-23T05:33:13Z Mar 20 16:01:06 crc kubenswrapper[4708]: I0320 16:01:06.058109 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:06Z is after 2026-02-23T05:33:13Z Mar 20 16:01:06 crc kubenswrapper[4708]: E0320 16:01:06.222413 4708 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 16:01:06 crc kubenswrapper[4708]: I0320 16:01:06.935871 4708 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 20 16:01:06 crc kubenswrapper[4708]: I0320 16:01:06.937870 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:06 crc kubenswrapper[4708]: I0320 16:01:06.937933 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:06 crc kubenswrapper[4708]: I0320 16:01:06.937948 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:06 crc kubenswrapper[4708]: I0320 16:01:06.937987 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:01:06 crc kubenswrapper[4708]: E0320 16:01:06.941538 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:06Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 16:01:06 crc kubenswrapper[4708]: E0320 16:01:06.944902 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:06Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.058442 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:07Z is after 2026-02-23T05:33:13Z Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.215828 4708 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.216071 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.217323 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.217376 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.217397 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.218187 4708 scope.go:117] "RemoveContainer" containerID="3cb502f5b3a8483599197c6a24a3959ff5dd3a0eaf398f92e39d8ec84728c989" Mar 20 16:01:07 crc kubenswrapper[4708]: E0320 16:01:07.218404 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.333133 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.333389 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.335067 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.335228 4708 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.335254 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:07 crc kubenswrapper[4708]: I0320 16:01:07.354259 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 16:01:07 crc kubenswrapper[4708]: W0320 16:01:07.390227 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:07Z is after 2026-02-23T05:33:13Z Mar 20 16:01:07 crc kubenswrapper[4708]: E0320 16:01:07.390354 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:08 crc kubenswrapper[4708]: I0320 16:01:08.059165 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:08Z is after 2026-02-23T05:33:13Z Mar 20 16:01:08 crc kubenswrapper[4708]: I0320 16:01:08.159960 4708 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:01:08 crc kubenswrapper[4708]: I0320 16:01:08.160369 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:01:08 crc kubenswrapper[4708]: W0320 16:01:08.225377 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:08Z is after 2026-02-23T05:33:13Z Mar 20 16:01:08 crc kubenswrapper[4708]: E0320 16:01:08.225483 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:08 crc kubenswrapper[4708]: I0320 16:01:08.271225 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:08 crc kubenswrapper[4708]: I0320 16:01:08.272982 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:08 crc kubenswrapper[4708]: I0320 16:01:08.273060 4708 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:08 crc kubenswrapper[4708]: I0320 16:01:08.273082 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:08 crc kubenswrapper[4708]: I0320 16:01:08.728617 4708 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 16:01:08 crc kubenswrapper[4708]: E0320 16:01:08.733530 4708 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:09 crc kubenswrapper[4708]: I0320 16:01:09.059555 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:09Z is after 2026-02-23T05:33:13Z Mar 20 16:01:10 crc kubenswrapper[4708]: I0320 16:01:10.056303 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:10Z is after 2026-02-23T05:33:13Z Mar 20 16:01:10 crc kubenswrapper[4708]: E0320 16:01:10.527453 4708 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T16:01:10Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e9803f826d3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,LastTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:11 crc kubenswrapper[4708]: I0320 16:01:11.058785 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:11Z is after 2026-02-23T05:33:13Z Mar 20 16:01:11 crc kubenswrapper[4708]: W0320 16:01:11.315161 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:11Z is after 2026-02-23T05:33:13Z Mar 20 16:01:11 crc kubenswrapper[4708]: E0320 16:01:11.315295 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:11Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 20 16:01:12 crc kubenswrapper[4708]: I0320 16:01:12.059948 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:12Z is after 2026-02-23T05:33:13Z Mar 20 16:01:13 crc kubenswrapper[4708]: I0320 16:01:13.059540 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:13Z is after 2026-02-23T05:33:13Z Mar 20 16:01:13 crc kubenswrapper[4708]: E0320 16:01:13.945785 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:13Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 16:01:13 crc kubenswrapper[4708]: I0320 16:01:13.945931 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:13 crc kubenswrapper[4708]: I0320 16:01:13.948194 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:13 crc kubenswrapper[4708]: I0320 16:01:13.948255 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:13 crc kubenswrapper[4708]: I0320 16:01:13.948314 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:13 crc kubenswrapper[4708]: I0320 16:01:13.948368 4708 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:01:13 crc kubenswrapper[4708]: E0320 16:01:13.953997 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:13Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 16:01:14 crc kubenswrapper[4708]: I0320 16:01:14.059232 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:14Z is after 2026-02-23T05:33:13Z Mar 20 16:01:15 crc kubenswrapper[4708]: I0320 16:01:15.057927 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:15Z is after 2026-02-23T05:33:13Z Mar 20 16:01:15 crc kubenswrapper[4708]: W0320 16:01:15.531595 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:15Z is after 2026-02-23T05:33:13Z Mar 20 16:01:15 crc kubenswrapper[4708]: E0320 16:01:15.531740 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T16:01:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:16 crc kubenswrapper[4708]: I0320 16:01:16.058524 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:16Z is after 2026-02-23T05:33:13Z Mar 20 16:01:16 crc kubenswrapper[4708]: E0320 16:01:16.222582 4708 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 16:01:17 crc kubenswrapper[4708]: I0320 16:01:17.056854 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 2026-02-23T05:33:13Z Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.060852 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:18Z is after 2026-02-23T05:33:13Z Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.159354 4708 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.159502 
4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.159608 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.159904 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.161896 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.161955 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.161981 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.162749 4708 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.163024 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" containerID="cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8" gracePeriod=30 Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.303630 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.304165 4708 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8" exitCode=255 Mar 20 16:01:18 crc kubenswrapper[4708]: I0320 16:01:18.304423 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8"} Mar 20 16:01:19 crc kubenswrapper[4708]: I0320 16:01:19.061082 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:19Z is after 2026-02-23T05:33:13Z Mar 20 16:01:19 crc kubenswrapper[4708]: I0320 16:01:19.311496 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 16:01:19 crc kubenswrapper[4708]: I0320 16:01:19.312182 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3"} Mar 20 16:01:19 crc 
kubenswrapper[4708]: I0320 16:01:19.312431 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:19 crc kubenswrapper[4708]: I0320 16:01:19.314117 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:19 crc kubenswrapper[4708]: I0320 16:01:19.314220 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:19 crc kubenswrapper[4708]: I0320 16:01:19.314452 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:19 crc kubenswrapper[4708]: I0320 16:01:19.553398 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:01:20 crc kubenswrapper[4708]: I0320 16:01:20.059206 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:20Z is after 2026-02-23T05:33:13Z Mar 20 16:01:20 crc kubenswrapper[4708]: I0320 16:01:20.314739 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:20 crc kubenswrapper[4708]: I0320 16:01:20.315762 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:20 crc kubenswrapper[4708]: I0320 16:01:20.315797 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:20 crc kubenswrapper[4708]: I0320 16:01:20.315810 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:20 crc kubenswrapper[4708]: E0320 
16:01:20.532770 4708 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:20Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e9803f826d3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,LastTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:20 crc kubenswrapper[4708]: W0320 16:01:20.593186 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:20Z is after 2026-02-23T05:33:13Z Mar 20 16:01:20 crc kubenswrapper[4708]: E0320 16:01:20.593307 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:20 crc kubenswrapper[4708]: E0320 16:01:20.949441 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:20Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 16:01:20 crc kubenswrapper[4708]: I0320 16:01:20.955089 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:20 crc kubenswrapper[4708]: I0320 16:01:20.956543 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:20 crc kubenswrapper[4708]: I0320 16:01:20.956640 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:20 crc kubenswrapper[4708]: I0320 16:01:20.956658 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:20 crc kubenswrapper[4708]: I0320 16:01:20.956715 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:01:20 crc kubenswrapper[4708]: E0320 16:01:20.960027 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:20Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.059050 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:21Z is after 2026-02-23T05:33:13Z Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.110381 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.112301 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.112364 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.112380 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.113161 4708 scope.go:117] "RemoveContainer" containerID="3cb502f5b3a8483599197c6a24a3959ff5dd3a0eaf398f92e39d8ec84728c989" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.319550 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.321911 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"38c50e977ead1084a0ad861a0da76dafba67a88c7651e6ee33853138d5b6f513"} Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.321975 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.322194 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.322893 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.322940 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:21 crc kubenswrapper[4708]: 
I0320 16:01:21.322954 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.323278 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.323321 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:21 crc kubenswrapper[4708]: I0320 16:01:21.323334 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:22 crc kubenswrapper[4708]: I0320 16:01:22.058906 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:22Z is after 2026-02-23T05:33:13Z Mar 20 16:01:22 crc kubenswrapper[4708]: I0320 16:01:22.329449 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 16:01:22 crc kubenswrapper[4708]: I0320 16:01:22.330505 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 16:01:22 crc kubenswrapper[4708]: I0320 16:01:22.332792 4708 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="38c50e977ead1084a0ad861a0da76dafba67a88c7651e6ee33853138d5b6f513" exitCode=255 Mar 20 16:01:22 crc kubenswrapper[4708]: I0320 16:01:22.332849 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"38c50e977ead1084a0ad861a0da76dafba67a88c7651e6ee33853138d5b6f513"} Mar 20 16:01:22 crc kubenswrapper[4708]: I0320 16:01:22.332905 4708 scope.go:117] "RemoveContainer" containerID="3cb502f5b3a8483599197c6a24a3959ff5dd3a0eaf398f92e39d8ec84728c989" Mar 20 16:01:22 crc kubenswrapper[4708]: I0320 16:01:22.333170 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:22 crc kubenswrapper[4708]: I0320 16:01:22.334483 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:22 crc kubenswrapper[4708]: I0320 16:01:22.334537 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:22 crc kubenswrapper[4708]: I0320 16:01:22.334555 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:22 crc kubenswrapper[4708]: I0320 16:01:22.335450 4708 scope.go:117] "RemoveContainer" containerID="38c50e977ead1084a0ad861a0da76dafba67a88c7651e6ee33853138d5b6f513" Mar 20 16:01:22 crc kubenswrapper[4708]: E0320 16:01:22.336367 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:23 crc kubenswrapper[4708]: I0320 16:01:23.057755 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T16:01:23Z is after 2026-02-23T05:33:13Z Mar 20 16:01:23 crc kubenswrapper[4708]: I0320 16:01:23.339767 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 16:01:24 crc kubenswrapper[4708]: I0320 16:01:24.058378 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:24Z is after 2026-02-23T05:33:13Z Mar 20 16:01:25 crc kubenswrapper[4708]: I0320 16:01:25.056816 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:25Z is after 2026-02-23T05:33:13Z Mar 20 16:01:25 crc kubenswrapper[4708]: I0320 16:01:25.059868 4708 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 16:01:25 crc kubenswrapper[4708]: E0320 16:01:25.064361 4708 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:25 crc kubenswrapper[4708]: E0320 16:01:25.065514 4708 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable 
to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 20 16:01:25 crc kubenswrapper[4708]: I0320 16:01:25.158558 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:01:25 crc kubenswrapper[4708]: I0320 16:01:25.158795 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:25 crc kubenswrapper[4708]: I0320 16:01:25.160179 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:25 crc kubenswrapper[4708]: I0320 16:01:25.160216 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:25 crc kubenswrapper[4708]: I0320 16:01:25.160229 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:26 crc kubenswrapper[4708]: I0320 16:01:26.057098 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:26Z is after 2026-02-23T05:33:13Z Mar 20 16:01:26 crc kubenswrapper[4708]: E0320 16:01:26.222847 4708 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 16:01:26 crc kubenswrapper[4708]: I0320 16:01:26.887293 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:01:26 crc kubenswrapper[4708]: I0320 16:01:26.889099 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:26 crc kubenswrapper[4708]: I0320 16:01:26.890942 4708 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:26 crc kubenswrapper[4708]: I0320 16:01:26.891001 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:26 crc kubenswrapper[4708]: I0320 16:01:26.891019 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:26 crc kubenswrapper[4708]: I0320 16:01:26.892634 4708 scope.go:117] "RemoveContainer" containerID="38c50e977ead1084a0ad861a0da76dafba67a88c7651e6ee33853138d5b6f513" Mar 20 16:01:26 crc kubenswrapper[4708]: E0320 16:01:26.892926 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.057469 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:27Z is after 2026-02-23T05:33:13Z Mar 20 16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.216370 4708 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.353296 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.354323 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 
16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.354366 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.354379 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.355115 4708 scope.go:117] "RemoveContainer" containerID="38c50e977ead1084a0ad861a0da76dafba67a88c7651e6ee33853138d5b6f513" Mar 20 16:01:27 crc kubenswrapper[4708]: E0320 16:01:27.355318 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:27 crc kubenswrapper[4708]: E0320 16:01:27.955930 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:27Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.961061 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.963109 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.963174 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.963194 4708 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:27 crc kubenswrapper[4708]: I0320 16:01:27.963233 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:01:27 crc kubenswrapper[4708]: E0320 16:01:27.968458 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:27Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 16:01:28 crc kubenswrapper[4708]: I0320 16:01:28.059347 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:28Z is after 2026-02-23T05:33:13Z Mar 20 16:01:28 crc kubenswrapper[4708]: I0320 16:01:28.159597 4708 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:01:28 crc kubenswrapper[4708]: I0320 16:01:28.159710 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:01:29 crc kubenswrapper[4708]: I0320 16:01:29.057988 4708 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:29Z is after 2026-02-23T05:33:13Z Mar 20 16:01:29 crc kubenswrapper[4708]: W0320 16:01:29.479648 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:29Z is after 2026-02-23T05:33:13Z Mar 20 16:01:29 crc kubenswrapper[4708]: E0320 16:01:29.479811 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:30 crc kubenswrapper[4708]: I0320 16:01:30.059392 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:30Z is after 2026-02-23T05:33:13Z Mar 20 16:01:30 crc kubenswrapper[4708]: W0320 16:01:30.234464 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:30Z is after 2026-02-23T05:33:13Z Mar 20 
16:01:30 crc kubenswrapper[4708]: E0320 16:01:30.234586 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 16:01:30 crc kubenswrapper[4708]: E0320 16:01:30.539095 4708 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e9803f826d3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,LastTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:31 crc kubenswrapper[4708]: I0320 16:01:31.057982 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:32 crc kubenswrapper[4708]: W0320 16:01:32.037889 4708 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" 
at the cluster scope Mar 20 16:01:32 crc kubenswrapper[4708]: E0320 16:01:32.038017 4708 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 16:01:32 crc kubenswrapper[4708]: I0320 16:01:32.058854 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:33 crc kubenswrapper[4708]: I0320 16:01:33.058497 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:34 crc kubenswrapper[4708]: I0320 16:01:34.061272 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:34 crc kubenswrapper[4708]: E0320 16:01:34.965136 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 16:01:34 crc kubenswrapper[4708]: I0320 16:01:34.969129 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:34 crc kubenswrapper[4708]: I0320 16:01:34.971038 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:34 crc kubenswrapper[4708]: 
I0320 16:01:34.971104 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:34 crc kubenswrapper[4708]: I0320 16:01:34.971131 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:34 crc kubenswrapper[4708]: I0320 16:01:34.971180 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:01:34 crc kubenswrapper[4708]: E0320 16:01:34.977357 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 16:01:35 crc kubenswrapper[4708]: I0320 16:01:35.060841 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:36 crc kubenswrapper[4708]: I0320 16:01:36.059481 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:36 crc kubenswrapper[4708]: E0320 16:01:36.223965 4708 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 16:01:37 crc kubenswrapper[4708]: I0320 16:01:37.058784 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:38 crc kubenswrapper[4708]: I0320 16:01:38.060384 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:38 crc kubenswrapper[4708]: I0320 16:01:38.160342 4708 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:01:38 crc kubenswrapper[4708]: I0320 16:01:38.160449 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 16:01:39 crc kubenswrapper[4708]: I0320 16:01:39.061537 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:39 crc kubenswrapper[4708]: I0320 16:01:39.799471 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 16:01:39 crc kubenswrapper[4708]: I0320 16:01:39.799685 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:39 crc kubenswrapper[4708]: I0320 16:01:39.801127 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:39 crc kubenswrapper[4708]: I0320 16:01:39.801192 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:39 crc kubenswrapper[4708]: I0320 16:01:39.801212 
4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:40 crc kubenswrapper[4708]: I0320 16:01:40.059545 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:40 crc kubenswrapper[4708]: I0320 16:01:40.110185 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:40 crc kubenswrapper[4708]: I0320 16:01:40.112131 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:40 crc kubenswrapper[4708]: I0320 16:01:40.112210 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:40 crc kubenswrapper[4708]: I0320 16:01:40.112229 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:40 crc kubenswrapper[4708]: I0320 16:01:40.113009 4708 scope.go:117] "RemoveContainer" containerID="38c50e977ead1084a0ad861a0da76dafba67a88c7651e6ee33853138d5b6f513" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.113220 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.549911 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189e9803f826d3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,LastTimestamp:2026-03-20 16:00:46.046172111 +0000 UTC m=+0.720508826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.554567 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe87358 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109193048 +0000 UTC m=+0.783529773,LastTimestamp:2026-03-20 16:00:46.109193048 +0000 UTC m=+0.783529773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.562435 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe8f813 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109227027 +0000 UTC m=+0.783563762,LastTimestamp:2026-03-20 16:00:46.109227027 +0000 UTC m=+0.783563762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.567319 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe9353e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109242686 +0000 UTC m=+0.783579421,LastTimestamp:2026-03-20 16:00:46.109242686 +0000 UTC m=+0.783579421,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.571404 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e980402470ec6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.216056518 +0000 UTC m=+0.890393233,LastTimestamp:2026-03-20 16:00:46.216056518 +0000 UTC m=+0.890393233,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.576099 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe87358\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe87358 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109193048 +0000 UTC m=+0.783529773,LastTimestamp:2026-03-20 16:00:46.316865452 +0000 UTC m=+0.991202207,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.581167 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe8f813\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe8f813 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109227027 +0000 UTC m=+0.783563762,LastTimestamp:2026-03-20 
16:00:46.31690855 +0000 UTC m=+0.991245305,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.585739 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe9353e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe9353e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109242686 +0000 UTC m=+0.783579421,LastTimestamp:2026-03-20 16:00:46.316932369 +0000 UTC m=+0.991269124,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.591193 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe87358\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe87358 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109193048 +0000 UTC m=+0.783529773,LastTimestamp:2026-03-20 16:00:46.415196299 +0000 UTC m=+1.089533024,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.596835 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe8f813\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe8f813 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109227027 +0000 UTC m=+0.783563762,LastTimestamp:2026-03-20 16:00:46.41522861 +0000 UTC m=+1.089565335,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.600906 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe9353e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe9353e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109242686 +0000 UTC m=+0.783579421,LastTimestamp:2026-03-20 16:00:46.4152425 +0000 UTC m=+1.089579225,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.606409 4708 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189e9803fbe87358\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe87358 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109193048 +0000 UTC m=+0.783529773,LastTimestamp:2026-03-20 16:00:46.416648373 +0000 UTC m=+1.090985088,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.610379 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe8f813\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe8f813 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109227027 +0000 UTC m=+0.783563762,LastTimestamp:2026-03-20 16:00:46.416683884 +0000 UTC m=+1.091020599,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.616063 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe9353e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe9353e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109242686 +0000 UTC m=+0.783579421,LastTimestamp:2026-03-20 16:00:46.416696044 +0000 UTC m=+1.091032759,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.620866 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe87358\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe87358 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109193048 +0000 UTC m=+0.783529773,LastTimestamp:2026-03-20 16:00:46.417305118 +0000 UTC m=+1.091641863,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.624992 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe8f813\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe8f813 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109227027 +0000 UTC m=+0.783563762,LastTimestamp:2026-03-20 16:00:46.417337919 +0000 UTC m=+1.091674674,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.629280 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe9353e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe9353e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109242686 +0000 UTC m=+0.783579421,LastTimestamp:2026-03-20 16:00:46.417355919 +0000 UTC m=+1.091692664,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.633445 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe87358\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe87358 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109193048 +0000 UTC m=+0.783529773,LastTimestamp:2026-03-20 16:00:46.417662267 +0000 UTC m=+1.091998982,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.637863 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe87358\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe87358 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109193048 +0000 UTC m=+0.783529773,LastTimestamp:2026-03-20 16:00:46.417696388 +0000 UTC m=+1.092033093,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.647026 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe8f813\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe8f813 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109227027 +0000 UTC 
m=+0.783563762,LastTimestamp:2026-03-20 16:00:46.417706818 +0000 UTC m=+1.092043523,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.650866 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe9353e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe9353e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109242686 +0000 UTC m=+0.783579421,LastTimestamp:2026-03-20 16:00:46.417715868 +0000 UTC m=+1.092052583,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.654921 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe8f813\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe8f813 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109227027 +0000 UTC m=+0.783563762,LastTimestamp:2026-03-20 16:00:46.417757649 +0000 UTC m=+1.092094354,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.659791 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe9353e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe9353e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109242686 +0000 UTC m=+0.783579421,LastTimestamp:2026-03-20 16:00:46.41777432 +0000 UTC m=+1.092111035,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.664359 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe87358\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe87358 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109193048 +0000 UTC m=+0.783529773,LastTimestamp:2026-03-20 16:00:46.418904245 +0000 UTC m=+1.093240970,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.668488 4708 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9803fbe8f813\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9803fbe8f813 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.109227027 +0000 UTC m=+0.783563762,LastTimestamp:2026-03-20 16:00:46.418938347 +0000 UTC m=+1.093275072,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.674383 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9804272ec98a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.835222922 +0000 UTC m=+1.509559637,LastTimestamp:2026-03-20 16:00:46.835222922 +0000 UTC m=+1.509559637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.678175 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980427d00a63 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.845790819 +0000 UTC m=+1.520127534,LastTimestamp:2026-03-20 16:00:46.845790819 +0000 UTC m=+1.520127534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.682124 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e980428454252 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.85347285 +0000 UTC m=+1.527809575,LastTimestamp:2026-03-20 16:00:46.85347285 +0000 UTC m=+1.527809575,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.686028 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980428888a29 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.857882153 +0000 UTC m=+1.532218888,LastTimestamp:2026-03-20 16:00:46.857882153 +0000 UTC m=+1.532218888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.690665 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804288bb686 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:46.858090118 +0000 UTC m=+1.532426823,LastTimestamp:2026-03-20 16:00:46.858090118 +0000 UTC m=+1.532426823,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.696991 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e98044e1ca134 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.488344372 +0000 UTC m=+2.162681087,LastTimestamp:2026-03-20 16:00:47.488344372 +0000 UTC m=+2.162681087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.701163 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e98044e1d7a97 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.488400023 +0000 UTC m=+2.162736748,LastTimestamp:2026-03-20 16:00:47.488400023 +0000 UTC m=+2.162736748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.706061 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98044e6c95a7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.493584295 +0000 UTC m=+2.167921020,LastTimestamp:2026-03-20 16:00:47.493584295 +0000 UTC m=+2.167921020,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.709806 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e98044e817f9b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.494954907 +0000 UTC m=+2.169291642,LastTimestamp:2026-03-20 16:00:47.494954907 +0000 UTC m=+2.169291642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.713927 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e98044e8cd9bb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.495698875 +0000 UTC m=+2.170035600,LastTimestamp:2026-03-20 16:00:47.495698875 +0000 UTC m=+2.170035600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.718599 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e98044ed541e8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.500444136 +0000 UTC m=+2.174780871,LastTimestamp:2026-03-20 16:00:47.500444136 +0000 UTC m=+2.174780871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.722840 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e98044eefd559 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.502185817 +0000 UTC m=+2.176522532,LastTimestamp:2026-03-20 16:00:47.502185817 +0000 UTC m=+2.176522532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc 
kubenswrapper[4708]: E0320 16:01:40.726747 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e98044f079f87 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.503744903 +0000 UTC m=+2.178081618,LastTimestamp:2026-03-20 16:00:47.503744903 +0000 UTC m=+2.178081618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.731324 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98044fcac395 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.516533653 +0000 UTC m=+2.190870368,LastTimestamp:2026-03-20 16:00:47.516533653 +0000 UTC 
m=+2.190870368,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.736291 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e98044fe54432 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.518270514 +0000 UTC m=+2.192607229,LastTimestamp:2026-03-20 16:00:47.518270514 +0000 UTC m=+2.192607229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.740826 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e98044fe7bafe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.518431998 +0000 UTC 
m=+2.192768713,LastTimestamp:2026-03-20 16:00:47.518431998 +0000 UTC m=+2.192768713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.745545 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980461907533 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.814702387 +0000 UTC m=+2.489039112,LastTimestamp:2026-03-20 16:00:47.814702387 +0000 UTC m=+2.489039112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.750843 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9804626c40e4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.829106916 +0000 UTC m=+2.503443661,LastTimestamp:2026-03-20 16:00:47.829106916 +0000 UTC m=+2.503443661,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.756019 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980462924a43 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.831599683 +0000 UTC m=+2.505936398,LastTimestamp:2026-03-20 16:00:47.831599683 +0000 UTC m=+2.505936398,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.761483 4708 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e98046fa91dd5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.051199445 +0000 UTC m=+2.725536160,LastTimestamp:2026-03-20 16:00:48.051199445 +0000 UTC m=+2.725536160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.771080 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980470806386 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.065307526 +0000 UTC m=+2.739644241,LastTimestamp:2026-03-20 16:00:48.065307526 +0000 UTC m=+2.739644241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.777373 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9804709c0d0f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.067120399 +0000 UTC m=+2.741457114,LastTimestamp:2026-03-20 16:00:48.067120399 +0000 UTC m=+2.741457114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.782213 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9804747b551e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.132085022 +0000 UTC m=+2.806421747,LastTimestamp:2026-03-20 16:00:48.132085022 +0000 UTC m=+2.806421747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.790090 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e980474ba3079 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.136204409 +0000 UTC m=+2.810541144,LastTimestamp:2026-03-20 16:00:48.136204409 +0000 UTC m=+2.810541144,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.796552 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980474e24205 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.138830341 +0000 UTC m=+2.813167096,LastTimestamp:2026-03-20 16:00:48.138830341 +0000 UTC m=+2.813167096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.801598 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980475851c10 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.149502992 +0000 UTC m=+2.823839727,LastTimestamp:2026-03-20 16:00:48.149502992 +0000 UTC m=+2.823839727,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.807007 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e98047f32c15c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.31187798 +0000 UTC m=+2.986214695,LastTimestamp:2026-03-20 16:00:48.31187798 +0000 UTC m=+2.986214695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.812300 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980480b4615f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container 
kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.337150303 +0000 UTC m=+3.011487018,LastTimestamp:2026-03-20 16:00:48.337150303 +0000 UTC m=+3.011487018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.819110 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980482f2d831 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.374798385 +0000 UTC m=+3.049135090,LastTimestamp:2026-03-20 16:00:48.374798385 +0000 UTC m=+3.049135090,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.826704 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9804835e962b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.381859371 +0000 UTC m=+3.056196096,LastTimestamp:2026-03-20 16:00:48.381859371 +0000 UTC m=+3.056196096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.830855 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e980483617300 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.382046976 +0000 UTC m=+3.056383691,LastTimestamp:2026-03-20 16:00:48.382046976 +0000 UTC m=+3.056383691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.834995 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980483af2dbf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.387141055 +0000 UTC m=+3.061477770,LastTimestamp:2026-03-20 16:00:48.387141055 +0000 UTC m=+3.061477770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.839016 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980483bdb82e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.388093998 +0000 UTC m=+3.062430713,LastTimestamp:2026-03-20 16:00:48.388093998 +0000 UTC m=+3.062430713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.844683 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980483d46c09 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.389581833 +0000 UTC m=+3.063918548,LastTimestamp:2026-03-20 16:00:48.389581833 +0000 UTC m=+3.063918548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.854087 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e98048481d33b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.400945979 +0000 UTC m=+3.075282704,LastTimestamp:2026-03-20 16:00:48.400945979 +0000 UTC m=+3.075282704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.859863 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804855e5076 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.415395958 +0000 UTC m=+3.089732693,LastTimestamp:2026-03-20 16:00:48.415395958 +0000 UTC m=+3.089732693,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.864412 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804857757ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.417036287 +0000 UTC m=+3.091373002,LastTimestamp:2026-03-20 16:00:48.417036287 +0000 UTC m=+3.091373002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.868465 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e980485afdc89 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.420740233 +0000 UTC m=+3.095076958,LastTimestamp:2026-03-20 16:00:48.420740233 +0000 UTC m=+3.095076958,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.873237 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980491085bbb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.611089339 +0000 UTC m=+3.285426054,LastTimestamp:2026-03-20 16:00:48.611089339 +0000 UTC m=+3.285426054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.879429 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980491ff0988 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.627255688 +0000 UTC m=+3.301592443,LastTimestamp:2026-03-20 16:00:48.627255688 +0000 UTC m=+3.301592443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.883797 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e980492227890 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.629577872 +0000 UTC m=+3.303914587,LastTimestamp:2026-03-20 16:00:48.629577872 +0000 UTC m=+3.303914587,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.889023 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e98049235a3d4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.630834132 +0000 UTC m=+3.305170847,LastTimestamp:2026-03-20 16:00:48.630834132 +0000 UTC m=+3.305170847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc 
kubenswrapper[4708]: E0320 16:01:40.896378 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804935ff2bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.65038406 +0000 UTC m=+3.324720775,LastTimestamp:2026-03-20 16:00:48.65038406 +0000 UTC m=+3.324720775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.901981 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980493716b89 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.651529097 +0000 UTC 
m=+3.325865812,LastTimestamp:2026-03-20 16:00:48.651529097 +0000 UTC m=+3.325865812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.906452 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e98049e78dc3e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.836566078 +0000 UTC m=+3.510902803,LastTimestamp:2026-03-20 16:00:48.836566078 +0000 UTC m=+3.510902803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.917526 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e98049fa5fe5e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started 
container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.85630115 +0000 UTC m=+3.530637865,LastTimestamp:2026-03-20 16:00:48.85630115 +0000 UTC m=+3.530637865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.924042 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e98049fdc6d8b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.859868555 +0000 UTC m=+3.534205270,LastTimestamp:2026-03-20 16:00:48.859868555 +0000 UTC m=+3.534205270,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.929356 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804a17afe70 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.887037552 +0000 UTC m=+3.561374307,LastTimestamp:2026-03-20 16:00:48.887037552 +0000 UTC m=+3.561374307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.933452 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804a1945d28 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:48.8887002 +0000 UTC m=+3.563036935,LastTimestamp:2026-03-20 16:00:48.8887002 +0000 UTC m=+3.563036935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.937761 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804ad5f923c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:49.086566972 +0000 UTC m=+3.760903697,LastTimestamp:2026-03-20 16:00:49.086566972 +0000 UTC m=+3.760903697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.943257 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804b3111a03 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:49.182087683 +0000 UTC m=+3.856424438,LastTimestamp:2026-03-20 16:00:49.182087683 +0000 UTC m=+3.856424438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.950617 
4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804b32c4f7c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:49.183870844 +0000 UTC m=+3.858207569,LastTimestamp:2026-03-20 16:00:49.183870844 +0000 UTC m=+3.858207569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.956115 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9804b3568cc9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:49.186639049 +0000 UTC 
m=+3.860975764,LastTimestamp:2026-03-20 16:00:49.186639049 +0000 UTC m=+3.860975764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.961263 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804c1350143 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:49.419321667 +0000 UTC m=+4.093658382,LastTimestamp:2026-03-20 16:00:49.419321667 +0000 UTC m=+4.093658382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.965717 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9804c1b29d9b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
16:00:49.427553691 +0000 UTC m=+4.101890406,LastTimestamp:2026-03-20 16:00:49.427553691 +0000 UTC m=+4.101890406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.969861 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804c27230e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:49.440108775 +0000 UTC m=+4.114445480,LastTimestamp:2026-03-20 16:00:49.440108775 +0000 UTC m=+4.114445480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.978423 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9804c2c9d7ca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:49.44585313 +0000 UTC m=+4.120189845,LastTimestamp:2026-03-20 16:00:49.44585313 +0000 UTC m=+4.120189845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.983325 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9804ef939954 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:50.197272916 +0000 UTC m=+4.871609651,LastTimestamp:2026-03-20 16:00:50.197272916 +0000 UTC m=+4.871609651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.988286 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9804b32c4f7c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804b32c4f7c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:49.183870844 +0000 UTC m=+3.858207569,LastTimestamp:2026-03-20 16:00:50.199794806 +0000 UTC m=+4.874131521,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:40 crc kubenswrapper[4708]: E0320 16:01:40.994617 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9804c1350143\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804c1350143 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:49.419321667 +0000 UTC m=+4.093658382,LastTimestamp:2026-03-20 16:00:50.412227939 +0000 UTC m=+5.086564654,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.003979 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9804fc8b4cef openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:50.414832879 +0000 UTC m=+5.089169594,LastTimestamp:2026-03-20 16:00:50.414832879 +0000 UTC m=+5.089169594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.010752 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9804c27230e7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9804c27230e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:49.440108775 +0000 UTC m=+4.114445480,LastTimestamp:2026-03-20 16:00:50.432151846 +0000 UTC m=+5.106488551,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.012841 4708 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9804fe240d49 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:50.441620809 +0000 UTC m=+5.115957524,LastTimestamp:2026-03-20 16:00:50.441620809 +0000 UTC m=+5.115957524,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.020904 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9804fe424cdd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:50.443603165 +0000 UTC m=+5.117939880,LastTimestamp:2026-03-20 16:00:50.443603165 +0000 UTC m=+5.117939880,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.029392 4708 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98050b3d09a3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:50.661362083 +0000 UTC m=+5.335698808,LastTimestamp:2026-03-20 16:00:50.661362083 +0000 UTC m=+5.335698808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.034109 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98050ead2f1c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:50.719043356 +0000 UTC m=+5.393380071,LastTimestamp:2026-03-20 16:00:50.719043356 +0000 UTC m=+5.393380071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.041124 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98050ec43bcc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:50.720553932 +0000 UTC m=+5.394890687,LastTimestamp:2026-03-20 16:00:50.720553932 +0000 UTC m=+5.394890687,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.045773 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98051d7fdcbd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:50.967731389 +0000 UTC m=+5.642068104,LastTimestamp:2026-03-20 16:00:50.967731389 +0000 UTC m=+5.642068104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.049889 4708 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98051e790494 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:50.984060052 +0000 UTC m=+5.658396767,LastTimestamp:2026-03-20 16:00:50.984060052 +0000 UTC m=+5.658396767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.058378 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98051e920691 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:50.985698961 +0000 UTC m=+5.660035676,LastTimestamp:2026-03-20 16:00:50.985698961 +0000 UTC m=+5.660035676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: I0320 
16:01:41.058687 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.064172 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98052b56d578 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:51.199923576 +0000 UTC m=+5.874260291,LastTimestamp:2026-03-20 16:00:51.199923576 +0000 UTC m=+5.874260291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.071340 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98052c7e37e2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:51.21928189 +0000 UTC m=+5.893618605,LastTimestamp:2026-03-20 
16:00:51.21928189 +0000 UTC m=+5.893618605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.082068 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98052c900b28 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:51.220450088 +0000 UTC m=+5.894786793,LastTimestamp:2026-03-20 16:00:51.220450088 +0000 UTC m=+5.894786793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.088284 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9805376be1e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:51.402629601 +0000 
UTC m=+6.076966316,LastTimestamp:2026-03-20 16:00:51.402629601 +0000 UTC m=+6.076966316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.093110 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e98053856912e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:51.418009902 +0000 UTC m=+6.092346617,LastTimestamp:2026-03-20 16:00:51.418009902 +0000 UTC m=+6.092346617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.100438 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 16:01:41 crc kubenswrapper[4708]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9806ca24321e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 16:01:41 crc kubenswrapper[4708]: body: Mar 20 16:01:41 crc kubenswrapper[4708]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:58.159149598 +0000 UTC m=+12.833486343,LastTimestamp:2026-03-20 16:00:58.159149598 +0000 UTC m=+12.833486343,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:01:41 crc kubenswrapper[4708]: > Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.105141 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9806ca257932 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:58.15923333 +0000 UTC m=+12.833570055,LastTimestamp:2026-03-20 16:00:58.15923333 +0000 UTC m=+12.833570055,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.109282 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event=< Mar 20 16:01:41 crc kubenswrapper[4708]: &Event{ObjectMeta:{kube-apiserver-crc.189e980757e51f9a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 16:01:41 crc kubenswrapper[4708]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 16:01:41 crc kubenswrapper[4708]: Mar 20 16:01:41 crc kubenswrapper[4708]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:00.537380762 +0000 UTC m=+15.211717487,LastTimestamp:2026-03-20 16:01:00.537380762 +0000 UTC m=+15.211717487,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:01:41 crc kubenswrapper[4708]: > Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.113180 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980757e6048e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:00.537439374 +0000 UTC 
m=+15.211776089,LastTimestamp:2026-03-20 16:01:00.537439374 +0000 UTC m=+15.211776089,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.115219 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e980757e51f9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 16:01:41 crc kubenswrapper[4708]: &Event{ObjectMeta:{kube-apiserver-crc.189e980757e51f9a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 16:01:41 crc kubenswrapper[4708]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 16:01:41 crc kubenswrapper[4708]: Mar 20 16:01:41 crc kubenswrapper[4708]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:00.537380762 +0000 UTC m=+15.211717487,LastTimestamp:2026-03-20 16:01:00.543851159 +0000 UTC m=+15.218187874,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:01:41 crc kubenswrapper[4708]: > Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.116799 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e980757e6048e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e980757e6048e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:00.537439374 +0000 UTC m=+15.211776089,LastTimestamp:2026-03-20 16:01:00.543904691 +0000 UTC m=+15.218241406,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.118922 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 16:01:41 crc kubenswrapper[4708]: &Event{ObjectMeta:{kube-controller-manager-crc.189e98091e3e3d5a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 16:01:41 crc kubenswrapper[4708]: body: Mar 20 16:01:41 crc kubenswrapper[4708]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:08.160077146 +0000 UTC m=+22.834413881,LastTimestamp:2026-03-20 16:01:08.160077146 +0000 UTC 
m=+22.834413881,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:01:41 crc kubenswrapper[4708]: > Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.129739 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e98091e43d38d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:08.160443277 +0000 UTC m=+22.834779992,LastTimestamp:2026-03-20 16:01:08.160443277 +0000 UTC m=+22.834779992,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.131528 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e98091e3e3d5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 16:01:41 crc kubenswrapper[4708]: &Event{ObjectMeta:{kube-controller-manager-crc.189e98091e3e3d5a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 16:01:41 crc kubenswrapper[4708]: body: Mar 20 16:01:41 crc kubenswrapper[4708]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:08.160077146 +0000 UTC m=+22.834413881,LastTimestamp:2026-03-20 16:01:18.159454111 +0000 UTC m=+32.833790866,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:01:41 crc kubenswrapper[4708]: > Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.137810 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e98091e43d38d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e98091e43d38d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:08.160443277 +0000 UTC m=+22.834779992,LastTimestamp:2026-03-20 16:01:18.159553844 
+0000 UTC m=+32.833890599,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.143539 4708 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980b7276aa42 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:18.162995778 +0000 UTC m=+32.837332533,LastTimestamp:2026-03-20 16:01:18.162995778 +0000 UTC m=+32.837332533,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.148624 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e98044f079f87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e98044f079f87 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.503744903 +0000 UTC m=+2.178081618,LastTimestamp:2026-03-20 16:01:18.296965232 +0000 UTC m=+32.971301957,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.154191 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e980461907533\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e980461907533 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.814702387 +0000 UTC m=+2.489039112,LastTimestamp:2026-03-20 16:01:18.535789447 +0000 UTC m=+33.210126172,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.159252 4708 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189e9804626c40e4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9804626c40e4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:47.829106916 +0000 UTC m=+2.503443661,LastTimestamp:2026-03-20 16:01:18.547769354 +0000 UTC m=+33.222106109,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.168249 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e98091e3e3d5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 16:01:41 crc kubenswrapper[4708]: &Event{ObjectMeta:{kube-controller-manager-crc.189e98091e3e3d5a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
20 16:01:41 crc kubenswrapper[4708]: body: Mar 20 16:01:41 crc kubenswrapper[4708]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:08.160077146 +0000 UTC m=+22.834413881,LastTimestamp:2026-03-20 16:01:28.159683743 +0000 UTC m=+42.834020458,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:01:41 crc kubenswrapper[4708]: > Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.172390 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e98091e43d38d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e98091e43d38d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:01:08.160443277 +0000 UTC m=+22.834779992,LastTimestamp:2026-03-20 16:01:28.159747194 +0000 UTC m=+42.834083909,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.176482 4708 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9806ca24321e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 16:01:41 crc kubenswrapper[4708]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9806ca24321e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 16:01:41 crc kubenswrapper[4708]: body: Mar 20 16:01:41 crc kubenswrapper[4708]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:00:58.159149598 +0000 UTC m=+12.833486343,LastTimestamp:2026-03-20 16:01:38.160413448 +0000 UTC m=+52.834750203,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 16:01:41 crc kubenswrapper[4708]: > Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.970050 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 16:01:41 crc kubenswrapper[4708]: I0320 16:01:41.978111 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:41 crc kubenswrapper[4708]: I0320 16:01:41.979585 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:41 crc kubenswrapper[4708]: I0320 16:01:41.979618 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:41 crc 
kubenswrapper[4708]: I0320 16:01:41.979627 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:41 crc kubenswrapper[4708]: I0320 16:01:41.979654 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:01:41 crc kubenswrapper[4708]: E0320 16:01:41.983374 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 16:01:42 crc kubenswrapper[4708]: I0320 16:01:42.058776 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:43 crc kubenswrapper[4708]: I0320 16:01:43.057600 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:44 crc kubenswrapper[4708]: I0320 16:01:44.058435 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:45 crc kubenswrapper[4708]: I0320 16:01:45.060869 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:45 crc kubenswrapper[4708]: I0320 16:01:45.166540 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:01:45 crc 
kubenswrapper[4708]: I0320 16:01:45.166752 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:45 crc kubenswrapper[4708]: I0320 16:01:45.167996 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:45 crc kubenswrapper[4708]: I0320 16:01:45.168031 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:45 crc kubenswrapper[4708]: I0320 16:01:45.168041 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:45 crc kubenswrapper[4708]: I0320 16:01:45.170858 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 16:01:45 crc kubenswrapper[4708]: I0320 16:01:45.402616 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:45 crc kubenswrapper[4708]: I0320 16:01:45.403797 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:45 crc kubenswrapper[4708]: I0320 16:01:45.403846 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:45 crc kubenswrapper[4708]: I0320 16:01:45.403861 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:46 crc kubenswrapper[4708]: I0320 16:01:46.059374 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:46 crc kubenswrapper[4708]: E0320 16:01:46.224576 4708 eviction_manager.go:285] "Eviction manager: failed to get summary stats" 
err="failed to get node info: node \"crc\" not found" Mar 20 16:01:47 crc kubenswrapper[4708]: I0320 16:01:47.059766 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:48 crc kubenswrapper[4708]: I0320 16:01:48.063416 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:48 crc kubenswrapper[4708]: E0320 16:01:48.978152 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 16:01:48 crc kubenswrapper[4708]: I0320 16:01:48.984371 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:48 crc kubenswrapper[4708]: I0320 16:01:48.986008 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:48 crc kubenswrapper[4708]: I0320 16:01:48.986039 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:48 crc kubenswrapper[4708]: I0320 16:01:48.986049 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:48 crc kubenswrapper[4708]: I0320 16:01:48.986075 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:01:48 crc kubenswrapper[4708]: E0320 16:01:48.987839 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" 
cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 16:01:49 crc kubenswrapper[4708]: I0320 16:01:49.058783 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:50 crc kubenswrapper[4708]: I0320 16:01:50.063469 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.059492 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.110741 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.112259 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.112316 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.112334 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.113180 4708 scope.go:117] "RemoveContainer" containerID="38c50e977ead1084a0ad861a0da76dafba67a88c7651e6ee33853138d5b6f513" Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.432446 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.437194 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670"} Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.437502 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.438589 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.438616 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:51 crc kubenswrapper[4708]: I0320 16:01:51.438628 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:52 crc kubenswrapper[4708]: I0320 16:01:52.061484 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:52 crc kubenswrapper[4708]: I0320 16:01:52.442046 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 16:01:52 crc kubenswrapper[4708]: I0320 16:01:52.443038 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 16:01:52 crc kubenswrapper[4708]: I0320 16:01:52.445198 4708 
generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670" exitCode=255 Mar 20 16:01:52 crc kubenswrapper[4708]: I0320 16:01:52.445252 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670"} Mar 20 16:01:52 crc kubenswrapper[4708]: I0320 16:01:52.445302 4708 scope.go:117] "RemoveContainer" containerID="38c50e977ead1084a0ad861a0da76dafba67a88c7651e6ee33853138d5b6f513" Mar 20 16:01:52 crc kubenswrapper[4708]: I0320 16:01:52.445524 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:52 crc kubenswrapper[4708]: I0320 16:01:52.446659 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:52 crc kubenswrapper[4708]: I0320 16:01:52.446717 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:52 crc kubenswrapper[4708]: I0320 16:01:52.446728 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:52 crc kubenswrapper[4708]: I0320 16:01:52.447425 4708 scope.go:117] "RemoveContainer" containerID="8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670" Mar 20 16:01:52 crc kubenswrapper[4708]: E0320 16:01:52.447643 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:53 crc kubenswrapper[4708]: I0320 16:01:53.060309 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:53 crc kubenswrapper[4708]: I0320 16:01:53.450411 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 16:01:54 crc kubenswrapper[4708]: I0320 16:01:54.058732 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:55 crc kubenswrapper[4708]: I0320 16:01:55.061820 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:55 crc kubenswrapper[4708]: E0320 16:01:55.987172 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 16:01:55 crc kubenswrapper[4708]: I0320 16:01:55.988146 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:55 crc kubenswrapper[4708]: I0320 16:01:55.989638 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:55 crc kubenswrapper[4708]: I0320 16:01:55.989738 4708 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:55 crc kubenswrapper[4708]: I0320 16:01:55.989760 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:55 crc kubenswrapper[4708]: I0320 16:01:55.989797 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:01:55 crc kubenswrapper[4708]: E0320 16:01:55.996567 4708 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 16:01:56 crc kubenswrapper[4708]: I0320 16:01:56.055426 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:56 crc kubenswrapper[4708]: E0320 16:01:56.224751 4708 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 16:01:56 crc kubenswrapper[4708]: I0320 16:01:56.886779 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:01:56 crc kubenswrapper[4708]: I0320 16:01:56.887008 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:56 crc kubenswrapper[4708]: I0320 16:01:56.888691 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:56 crc kubenswrapper[4708]: I0320 16:01:56.888810 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:01:56 crc kubenswrapper[4708]: I0320 16:01:56.888895 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:56 
crc kubenswrapper[4708]: I0320 16:01:56.889655 4708 scope.go:117] "RemoveContainer" containerID="8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670" Mar 20 16:01:56 crc kubenswrapper[4708]: E0320 16:01:56.889982 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:57 crc kubenswrapper[4708]: I0320 16:01:57.059235 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:57 crc kubenswrapper[4708]: I0320 16:01:57.067539 4708 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 16:01:57 crc kubenswrapper[4708]: I0320 16:01:57.083243 4708 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 16:01:57 crc kubenswrapper[4708]: I0320 16:01:57.216202 4708 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:01:57 crc kubenswrapper[4708]: I0320 16:01:57.464655 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:01:57 crc kubenswrapper[4708]: I0320 16:01:57.466383 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:01:57 crc kubenswrapper[4708]: I0320 16:01:57.466466 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 
16:01:57 crc kubenswrapper[4708]: I0320 16:01:57.466488 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:01:57 crc kubenswrapper[4708]: I0320 16:01:57.467799 4708 scope.go:117] "RemoveContainer" containerID="8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670" Mar 20 16:01:57 crc kubenswrapper[4708]: E0320 16:01:57.468222 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:01:58 crc kubenswrapper[4708]: I0320 16:01:58.059975 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:01:59 crc kubenswrapper[4708]: I0320 16:01:59.059818 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:02:00 crc kubenswrapper[4708]: I0320 16:02:00.060077 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:02:01 crc kubenswrapper[4708]: I0320 16:02:01.060713 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" 
at the cluster scope Mar 20 16:02:02 crc kubenswrapper[4708]: I0320 16:02:02.058905 4708 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 16:02:02 crc kubenswrapper[4708]: I0320 16:02:02.199295 4708 csr.go:261] certificate signing request csr-cjmgl is approved, waiting to be issued Mar 20 16:02:02 crc kubenswrapper[4708]: I0320 16:02:02.206443 4708 csr.go:257] certificate signing request csr-cjmgl is issued Mar 20 16:02:02 crc kubenswrapper[4708]: I0320 16:02:02.284266 4708 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 16:02:02 crc kubenswrapper[4708]: I0320 16:02:02.901293 4708 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 16:02:02 crc kubenswrapper[4708]: I0320 16:02:02.997382 4708 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 16:02:02 crc kubenswrapper[4708]: I0320 16:02:02.999550 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:02 crc kubenswrapper[4708]: I0320 16:02:02.999602 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:02.999617 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:02.999772 4708 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.009326 4708 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.009888 4708 kubelet_node_status.go:79] "Successfully registered 
node" node="crc" Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.010006 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.013430 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.013612 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.013789 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.013963 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.014100 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:03Z","lastTransitionTime":"2026-03-20T16:02:03Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.030543 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5
-2336a8fbca00\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.040354 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.040589 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.040698 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.040856 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.040983 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:03Z","lastTransitionTime":"2026-03-20T16:02:03Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.052395 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5
-2336a8fbca00\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.062792 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.062886 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.062916 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.062949 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.062971 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:03Z","lastTransitionTime":"2026-03-20T16:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.077615 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.087016 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.087080 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.087093 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.087118 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.087132 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:03Z","lastTransitionTime":"2026-03-20T16:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.102820 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.103010 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.103053 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.205138 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.208569 4708 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-03 13:30:11.596597481 +0000 UTC Mar 20 16:02:03 crc kubenswrapper[4708]: I0320 16:02:03.208655 4708 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6933h28m8.387951037s for next certificate rotation Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.306055 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.406510 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.507083 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.607501 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.708712 4708 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.809818 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:03 crc kubenswrapper[4708]: E0320 16:02:03.910292 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:04 crc kubenswrapper[4708]: E0320 16:02:04.010729 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:04 crc kubenswrapper[4708]: E0320 16:02:04.110979 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:04 crc kubenswrapper[4708]: E0320 16:02:04.212180 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:04 crc kubenswrapper[4708]: E0320 16:02:04.313211 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:04 crc kubenswrapper[4708]: E0320 16:02:04.414109 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:04 crc kubenswrapper[4708]: E0320 16:02:04.514656 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:04 crc kubenswrapper[4708]: E0320 16:02:04.615778 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:04 crc kubenswrapper[4708]: E0320 16:02:04.716378 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:04 crc kubenswrapper[4708]: E0320 16:02:04.817446 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:04 crc kubenswrapper[4708]: E0320 16:02:04.918636 4708 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:05 crc kubenswrapper[4708]: E0320 16:02:05.019558 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:05 crc kubenswrapper[4708]: E0320 16:02:05.120705 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:05 crc kubenswrapper[4708]: E0320 16:02:05.221869 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:05 crc kubenswrapper[4708]: E0320 16:02:05.322489 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:05 crc kubenswrapper[4708]: E0320 16:02:05.423541 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:05 crc kubenswrapper[4708]: E0320 16:02:05.523784 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:05 crc kubenswrapper[4708]: E0320 16:02:05.624103 4708 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.691251 4708 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.727616 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.728028 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.728135 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:05 crc 
kubenswrapper[4708]: I0320 16:02:05.728236 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.728329 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:05Z","lastTransitionTime":"2026-03-20T16:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.832030 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.832111 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.832128 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.832153 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.832170 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:05Z","lastTransitionTime":"2026-03-20T16:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.935505 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.935573 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.935843 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.935962 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:05 crc kubenswrapper[4708]: I0320 16:02:05.935978 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:05Z","lastTransitionTime":"2026-03-20T16:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.038350 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.038723 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.038846 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.038940 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.039064 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:06Z","lastTransitionTime":"2026-03-20T16:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.084309 4708 apiserver.go:52] "Watching apiserver" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.091692 4708 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.092239 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.092938 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.093408 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.093550 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.094253 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.094626 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.094637 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.095203 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.095343 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.096234 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.097444 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.097940 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.098358 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.099178 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.099563 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.099921 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.100228 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.100260 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.105329 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.129250 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.141325 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.141360 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.141369 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.141384 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.141393 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:06Z","lastTransitionTime":"2026-03-20T16:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.150540 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.162152 4708 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.164404 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.176398 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.189480 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.203234 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212003 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212075 4708 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212115 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212151 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212193 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212226 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212265 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212297 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212328 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212361 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212392 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212423 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212458 4708 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212492 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212527 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212573 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212633 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212707 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212750 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212811 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212853 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212950 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212998 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 16:02:06 crc 
kubenswrapper[4708]: I0320 16:02:06.213038 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213074 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213108 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213148 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213212 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213250 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213286 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213321 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213394 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213442 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213515 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213574 4708 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213699 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213753 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213789 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213822 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213860 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213895 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213972 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214027 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214079 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214130 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214183 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214240 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214297 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214345 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214392 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214431 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214467 
4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214507 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214558 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214609 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214647 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214762 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214816 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214864 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214901 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214945 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214994 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215037 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215073 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215107 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215154 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215204 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215262 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212514 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212566 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215325 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215382 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215440 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215485 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215535 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215582 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215633 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215826 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215891 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215947 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215998 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216047 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216102 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216154 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216203 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216261 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216312 4708 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216356 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216411 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216449 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216926 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217003 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217047 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217083 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217121 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217146 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217185 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217226 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217251 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217281 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217307 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217378 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217412 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217446 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217526 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217566 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217653 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218222 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218250 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218274 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218300 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218373 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218403 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218429 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:06 
crc kubenswrapper[4708]: I0320 16:02:06.218453 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218483 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218510 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218536 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218560 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218583 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218605 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218629 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218653 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218696 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218722 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218761 
4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218783 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218803 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218826 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218850 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218874 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218902 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219027 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219054 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219076 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219100 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 
16:02:06.219122 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219144 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219166 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219192 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219224 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219251 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219273 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219295 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219318 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219341 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219362 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") 
" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219386 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219408 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219431 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219456 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219480 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219502 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219537 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219570 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219606 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219642 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219697 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 
16:02:06.219739 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219779 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219813 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219845 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219876 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219916 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219952 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219982 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220018 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220053 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220087 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 16:02:06 crc 
kubenswrapper[4708]: I0320 16:02:06.220120 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220175 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220213 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220247 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220279 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220312 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220346 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220380 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220412 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220447 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220497 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 16:02:06 crc 
kubenswrapper[4708]: I0320 16:02:06.220527 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220557 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220588 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220618 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220649 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220706 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220740 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220772 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220807 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220838 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220871 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220911 4708 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220945 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220981 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221012 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221036 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221062 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: 
\"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221138 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221181 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221214 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221242 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221269 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221295 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221318 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221373 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221400 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221427 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221458 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221482 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221560 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221586 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:06 
crc kubenswrapper[4708]: I0320 16:02:06.221702 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221725 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219481 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212700 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.212974 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213516 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213702 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.213877 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214034 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214181 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214184 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214455 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214549 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214696 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214710 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214852 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.214996 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215133 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215273 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215376 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215477 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215617 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.215874 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216033 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216181 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216314 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.216491 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.217512 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.218661 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219163 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219261 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219550 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.219858 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220013 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220149 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220203 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220273 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220384 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220749 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220774 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.220767 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221440 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.221921 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.222337 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.222403 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.222783 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.222804 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.223019 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.223566 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.223597 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.223720 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.223863 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.223911 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.224038 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.224099 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.224119 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.224302 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.224821 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.224850 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.224891 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.225510 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.225559 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.225645 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.226083 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.226109 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.226134 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.226141 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.226625 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.226771 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.226799 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.226945 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.227585 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.228032 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.228503 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.228903 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.228990 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.229087 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.229211 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.229580 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.229649 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.229750 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.229885 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.230190 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.230381 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.230632 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.232373 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.232597 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.234485 4708 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.234861 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:02:06.734822225 +0000 UTC m=+81.409158980 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.235971 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:06.735895294 +0000 UTC m=+81.410232039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.236935 4708 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.238295 4708 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.238414 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 16:02:06.73838194 +0000 UTC m=+81.412718745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.239606 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.240589 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.241218 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.241285 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod 
"1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.241894 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.242320 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.242811 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.243243 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.243264 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.243492 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.244264 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.246551 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.246613 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.246638 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.246702 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.246732 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:06Z","lastTransitionTime":"2026-03-20T16:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.250162 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.251276 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.251491 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.251634 4708 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.252741 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:06.752641979 +0000 UTC m=+81.426978884 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.254455 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.256027 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.258073 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.262212 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.262387 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.264077 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.264287 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.265230 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.267954 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.268888 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.268929 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.268954 4708 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.269027 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:06.769001533 +0000 UTC m=+81.443338288 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.269118 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.270422 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.270796 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.273821 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.274049 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.274768 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.275977 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.276021 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.276285 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.276291 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.276579 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.277025 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.277068 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.275953 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.277308 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.277110 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.277338 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.277398 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.278335 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.278406 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.278799 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.279031 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.279585 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.280183 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.281736 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.281785 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.281872 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.282219 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.282774 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.283113 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286087 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286122 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.283184 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286128 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286285 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286492 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286801 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286837 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286547 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286887 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286607 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286633 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.286644 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.287397 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.287615 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.288417 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.288650 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.291374 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.291850 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.291924 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.292436 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.292471 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.293314 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.293555 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.293542 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.293588 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.293633 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.293927 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.293957 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.294029 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.294119 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.294175 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.294266 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.294546 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.294651 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.294727 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.294872 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.294996 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.295193 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.295874 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.296044 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.296280 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.296340 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.296067 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.297457 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.297795 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.298268 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.298608 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.298834 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.299102 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.299249 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.299301 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.299412 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.299491 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.299837 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.299925 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.300451 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.300623 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.300971 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.304260 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.309514 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.310046 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.310907 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.313645 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.314460 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.318663 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.321033 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323195 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323277 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323363 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323385 4708 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323401 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323417 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323434 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323450 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323467 4708 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323484 4708 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node 
\"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323706 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323725 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323758 4708 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323792 4708 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323814 4708 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323835 4708 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323850 4708 
reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323866 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323882 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323898 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323914 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323928 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323942 4708 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323956 4708 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323971 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.323985 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324000 4708 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324014 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324027 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324044 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324059 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node 
\"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324074 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324089 4708 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324103 4708 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324121 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324135 4708 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324151 4708 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324171 4708 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 
16:02:06.324188 4708 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324201 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324215 4708 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324229 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324243 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324256 4708 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324271 4708 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324287 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324301 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324314 4708 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324329 4708 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324343 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324358 4708 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324372 4708 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324388 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 
20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324401 4708 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324414 4708 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324427 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324442 4708 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324457 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324470 4708 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324483 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324497 
4708 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324511 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324526 4708 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324544 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324560 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324575 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324589 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324603 4708 reconciler_common.go:293] "Volume detached for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324619 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324634 4708 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324649 4708 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324662 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324706 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324726 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324743 4708 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324759 4708 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324777 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324795 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324812 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324831 4708 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324850 4708 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324865 4708 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324882 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324900 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324919 4708 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324963 4708 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324978 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.324990 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325003 4708 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 
crc kubenswrapper[4708]: I0320 16:02:06.325017 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325030 4708 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325045 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325059 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325074 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325089 4708 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325102 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325116 4708 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325131 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325145 4708 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325159 4708 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325173 4708 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325187 4708 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325201 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325214 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325228 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325242 4708 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325256 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325269 4708 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325282 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325297 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325309 4708 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325323 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325336 4708 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325349 4708 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325362 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325377 4708 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325392 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325406 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 
16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325419 4708 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325432 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325446 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325462 4708 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325477 4708 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325489 4708 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325503 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325516 4708 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325530 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325544 4708 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325559 4708 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325574 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325587 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325600 4708 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325616 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325632 4708 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325647 4708 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325661 4708 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325707 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325727 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325741 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325756 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325769 4708 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325782 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325796 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325809 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325822 4708 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325835 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325848 4708 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325861 4708 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325875 4708 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325888 4708 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325900 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325914 4708 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325928 4708 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325941 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325954 4708 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on 
node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325968 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325981 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.325995 4708 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326009 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326022 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326037 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326051 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326067 4708 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326080 4708 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326093 4708 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326108 4708 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326122 4708 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326135 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326150 4708 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326164 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" 
(UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326178 4708 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326191 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326205 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326220 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326238 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326252 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326266 4708 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326280 4708 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326293 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326306 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326320 4708 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326334 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326348 4708 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326361 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node 
\"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326374 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326388 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326401 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326416 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326431 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326445 4708 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326460 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 
16:02:06.326474 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.326488 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.331858 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.332984 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.334913 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.340787 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.341243 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.349552 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.349614 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.349640 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.349707 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.349746 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:06Z","lastTransitionTime":"2026-03-20T16:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.423757 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.427543 4708 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.427580 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.427601 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.434607 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.441070 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.454829 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.454869 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.454882 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.454900 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.454910 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:06Z","lastTransitionTime":"2026-03-20T16:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.456806 4708 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:02:06 crc kubenswrapper[4708]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 16:02:06 crc kubenswrapper[4708]: if [[ -f "/env/_master" ]]; then Mar 20 16:02:06 crc kubenswrapper[4708]: set -o allexport Mar 20 16:02:06 crc kubenswrapper[4708]: source "/env/_master" Mar 20 16:02:06 crc kubenswrapper[4708]: set +o allexport Mar 20 16:02:06 crc kubenswrapper[4708]: fi Mar 20 16:02:06 crc kubenswrapper[4708]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 16:02:06 crc kubenswrapper[4708]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 16:02:06 crc kubenswrapper[4708]: ho_enable="--enable-hybrid-overlay" Mar 20 16:02:06 crc kubenswrapper[4708]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 16:02:06 crc kubenswrapper[4708]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 16:02:06 crc kubenswrapper[4708]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 16:02:06 crc kubenswrapper[4708]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 16:02:06 crc kubenswrapper[4708]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 16:02:06 crc kubenswrapper[4708]: --webhook-host=127.0.0.1 \ Mar 20 16:02:06 crc kubenswrapper[4708]: --webhook-port=9743 \ Mar 20 16:02:06 crc kubenswrapper[4708]: ${ho_enable} \ Mar 20 16:02:06 crc kubenswrapper[4708]: --enable-interconnect \ Mar 20 16:02:06 crc kubenswrapper[4708]: --disable-approver \ Mar 20 16:02:06 crc kubenswrapper[4708]: 
--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 16:02:06 crc kubenswrapper[4708]: --wait-for-kubernetes-api=200s \ Mar 20 16:02:06 crc kubenswrapper[4708]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 16:02:06 crc kubenswrapper[4708]: --loglevel="${LOGLEVEL}" Mar 20 16:02:06 crc kubenswrapper[4708]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 16:02:06 crc kubenswrapper[4708]: > logger="UnhandledError" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.457736 4708 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:02:06 crc kubenswrapper[4708]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 16:02:06 crc kubenswrapper[4708]: set -o allexport Mar 20 16:02:06 crc kubenswrapper[4708]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 16:02:06 crc kubenswrapper[4708]: source /etc/kubernetes/apiserver-url.env Mar 20 16:02:06 crc kubenswrapper[4708]: else Mar 20 16:02:06 crc kubenswrapper[4708]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 16:02:06 crc kubenswrapper[4708]: exit 1 Mar 20 16:02:06 crc kubenswrapper[4708]: fi Mar 20 16:02:06 crc kubenswrapper[4708]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 16:02:06 crc kubenswrapper[4708]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 16:02:06 crc kubenswrapper[4708]: > logger="UnhandledError" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.458944 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.461376 4708 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:02:06 crc kubenswrapper[4708]: container 
&Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 16:02:06 crc kubenswrapper[4708]: if [[ -f "/env/_master" ]]; then Mar 20 16:02:06 crc kubenswrapper[4708]: set -o allexport Mar 20 16:02:06 crc kubenswrapper[4708]: source "/env/_master" Mar 20 16:02:06 crc kubenswrapper[4708]: set +o allexport Mar 20 16:02:06 crc kubenswrapper[4708]: fi Mar 20 16:02:06 crc kubenswrapper[4708]: Mar 20 16:02:06 crc kubenswrapper[4708]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 16:02:06 crc kubenswrapper[4708]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 16:02:06 crc kubenswrapper[4708]: --disable-webhook \ Mar 20 16:02:06 crc kubenswrapper[4708]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 16:02:06 crc kubenswrapper[4708]: --loglevel="${LOGLEVEL}" Mar 20 16:02:06 crc kubenswrapper[4708]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 16:02:06 crc kubenswrapper[4708]: > logger="UnhandledError" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.462716 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 16:02:06 crc kubenswrapper[4708]: W0320 16:02:06.463163 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-903620482b1d353c5bdb67ca4203de6fcdebb6195e2b485d6a0c960789168165 WatchSource:0}: Error finding container 903620482b1d353c5bdb67ca4203de6fcdebb6195e2b485d6a0c960789168165: Status 404 returned error can't find the container with id 903620482b1d353c5bdb67ca4203de6fcdebb6195e2b485d6a0c960789168165 Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.467150 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.468876 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.495903 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"903620482b1d353c5bdb67ca4203de6fcdebb6195e2b485d6a0c960789168165"} Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.497593 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePol
icy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.497626 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"27e6e97a60dba08d27824bf5a7354a90d239cfef445eaf2fe3d285e02e29b318"} Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.498570 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"976b39106a3655e376e99151e6a87f8666946e694eb862871adb161f25a2919c"} Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.498727 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.499403 4708 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:02:06 crc kubenswrapper[4708]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 16:02:06 crc kubenswrapper[4708]: if [[ -f "/env/_master" ]]; then Mar 20 16:02:06 crc kubenswrapper[4708]: set -o allexport Mar 20 16:02:06 crc kubenswrapper[4708]: source "/env/_master" Mar 20 16:02:06 crc 
kubenswrapper[4708]: set +o allexport Mar 20 16:02:06 crc kubenswrapper[4708]: fi Mar 20 16:02:06 crc kubenswrapper[4708]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 16:02:06 crc kubenswrapper[4708]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 16:02:06 crc kubenswrapper[4708]: ho_enable="--enable-hybrid-overlay" Mar 20 16:02:06 crc kubenswrapper[4708]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 16:02:06 crc kubenswrapper[4708]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 16:02:06 crc kubenswrapper[4708]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 16:02:06 crc kubenswrapper[4708]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 16:02:06 crc kubenswrapper[4708]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 16:02:06 crc kubenswrapper[4708]: --webhook-host=127.0.0.1 \ Mar 20 16:02:06 crc kubenswrapper[4708]: --webhook-port=9743 \ Mar 20 16:02:06 crc kubenswrapper[4708]: ${ho_enable} \ Mar 20 16:02:06 crc kubenswrapper[4708]: --enable-interconnect \ Mar 20 16:02:06 crc kubenswrapper[4708]: --disable-approver \ Mar 20 16:02:06 crc kubenswrapper[4708]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 16:02:06 crc kubenswrapper[4708]: --wait-for-kubernetes-api=200s \ Mar 20 16:02:06 crc kubenswrapper[4708]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 16:02:06 crc kubenswrapper[4708]: --loglevel="${LOGLEVEL}" Mar 20 16:02:06 crc kubenswrapper[4708]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 16:02:06 crc 
kubenswrapper[4708]: > logger="UnhandledError" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.502535 4708 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:02:06 crc kubenswrapper[4708]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 16:02:06 crc kubenswrapper[4708]: if [[ -f "/env/_master" ]]; then Mar 20 16:02:06 crc kubenswrapper[4708]: set -o allexport Mar 20 16:02:06 crc kubenswrapper[4708]: source "/env/_master" Mar 20 16:02:06 crc kubenswrapper[4708]: set +o allexport Mar 20 16:02:06 crc kubenswrapper[4708]: fi Mar 20 16:02:06 crc kubenswrapper[4708]: Mar 20 16:02:06 crc kubenswrapper[4708]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 16:02:06 crc kubenswrapper[4708]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 16:02:06 crc kubenswrapper[4708]: --disable-webhook \ Mar 20 16:02:06 crc kubenswrapper[4708]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 16:02:06 crc kubenswrapper[4708]: --loglevel="${LOGLEVEL}" Mar 20 16:02:06 crc kubenswrapper[4708]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 16:02:06 crc kubenswrapper[4708]: > logger="UnhandledError" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.503085 4708 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:02:06 crc kubenswrapper[4708]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 16:02:06 crc kubenswrapper[4708]: set -o allexport Mar 20 16:02:06 crc kubenswrapper[4708]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 16:02:06 crc 
kubenswrapper[4708]: source /etc/kubernetes/apiserver-url.env Mar 20 16:02:06 crc kubenswrapper[4708]: else Mar 20 16:02:06 crc kubenswrapper[4708]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 16:02:06 crc kubenswrapper[4708]: exit 1 Mar 20 16:02:06 crc kubenswrapper[4708]: fi Mar 20 16:02:06 crc kubenswrapper[4708]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 16:02:06 crc kubenswrapper[4708]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c
69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 16:02:06 crc kubenswrapper[4708]: > logger="UnhandledError" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.504069 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with 
CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.505134 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.512369 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.524363 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.534263 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.543901 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.552663 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.558289 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.558350 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.558376 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.558413 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.558440 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:06Z","lastTransitionTime":"2026-03-20T16:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.572629 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.584746 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.595932 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.606913 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.617980 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.628810 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.639359 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.661502 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.661530 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.661545 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.661564 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.661573 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:06Z","lastTransitionTime":"2026-03-20T16:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.764425 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.764466 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.764476 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.764490 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.764502 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:06Z","lastTransitionTime":"2026-03-20T16:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.831453 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.831837 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.831877 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.831910 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.831945 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832033 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832082 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832093 4708 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832101 4708 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832132 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832200 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832221 4708 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832226 4708 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832175 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:07.832152513 +0000 UTC m=+82.506489228 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832306 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:07.832280047 +0000 UTC m=+82.506616772 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832337 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:07.832328388 +0000 UTC m=+82.506665113 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832357 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:07.832350768 +0000 UTC m=+82.506687503 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:06 crc kubenswrapper[4708]: E0320 16:02:06.832385 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:02:07.832373129 +0000 UTC m=+82.506710164 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.867292 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.867354 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.867364 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.867385 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.867403 
4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:06Z","lastTransitionTime":"2026-03-20T16:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.969560 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.969619 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.969646 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.969694 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:06 crc kubenswrapper[4708]: I0320 16:02:06.969711 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:06Z","lastTransitionTime":"2026-03-20T16:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.073107 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.073191 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.073217 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.073255 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.073281 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:07Z","lastTransitionTime":"2026-03-20T16:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.176573 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.176648 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.176705 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.176732 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.176751 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:07Z","lastTransitionTime":"2026-03-20T16:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.279461 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.279536 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.279549 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.279568 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.279589 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:07Z","lastTransitionTime":"2026-03-20T16:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.382094 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.382157 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.382172 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.382195 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.382210 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:07Z","lastTransitionTime":"2026-03-20T16:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.484843 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.484896 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.484908 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.484928 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.484939 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:07Z","lastTransitionTime":"2026-03-20T16:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.587769 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.587809 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.587824 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.587840 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.587849 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:07Z","lastTransitionTime":"2026-03-20T16:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.690502 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.690564 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.690574 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.690590 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.690601 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:07Z","lastTransitionTime":"2026-03-20T16:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.792955 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.793006 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.793015 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.793032 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.793043 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:07Z","lastTransitionTime":"2026-03-20T16:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.842870 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.842981 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843064 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:02:09.843032784 +0000 UTC m=+84.517369519 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843074 4708 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.843135 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843152 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:09.843140337 +0000 UTC m=+84.517477212 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.843200 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843263 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843302 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843336 4708 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.843264 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843405 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843430 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843448 4708 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843336 4708 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843462 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:09.843420244 +0000 UTC m=+84.517756999 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843509 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:09.843479556 +0000 UTC m=+84.517816271 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:07 crc kubenswrapper[4708]: E0320 16:02:07.843527 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:09.843519857 +0000 UTC m=+84.517856572 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.895541 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.895628 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.895647 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.895715 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.895736 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:07Z","lastTransitionTime":"2026-03-20T16:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.998459 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.998554 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.998589 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.998618 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:07 crc kubenswrapper[4708]: I0320 16:02:07.998641 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:07Z","lastTransitionTime":"2026-03-20T16:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.101848 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.101912 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.101930 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.101956 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.101974 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:08Z","lastTransitionTime":"2026-03-20T16:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.110866 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.110936 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:08 crc kubenswrapper[4708]: E0320 16:02:08.111058 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.110864 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:08 crc kubenswrapper[4708]: E0320 16:02:08.111199 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:08 crc kubenswrapper[4708]: E0320 16:02:08.111473 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.119741 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.120488 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.122263 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.123526 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.125258 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.126327 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.127295 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.128711 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.129566 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.130988 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.131789 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.133413 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.134591 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.135958 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.137604 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.138360 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.140053 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.140594 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.141509 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.143199 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.143864 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.145248 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.145878 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.147330 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.148162 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.149399 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.151658 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.152639 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.154140 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.154839 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.156201 4708 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.156360 4708 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.158424 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.159656 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.160254 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.162484 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.163370 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.164572 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.165458 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.166935 4708 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.167549 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.168819 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.169648 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.171062 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.171653 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.172839 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.173470 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.175090 4708 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.175644 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.176716 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.177338 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.178498 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.179299 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.180033 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.205680 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.205740 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.205752 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.205770 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.205781 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:08Z","lastTransitionTime":"2026-03-20T16:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.308037 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.308107 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.308119 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.308140 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.308152 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:08Z","lastTransitionTime":"2026-03-20T16:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.411510 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.411561 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.411589 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.411609 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.411624 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:08Z","lastTransitionTime":"2026-03-20T16:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.513561 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.513697 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.513716 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.513733 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.513744 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:08Z","lastTransitionTime":"2026-03-20T16:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.616141 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.616186 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.616198 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.616217 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.616228 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:08Z","lastTransitionTime":"2026-03-20T16:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.718286 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.718338 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.718349 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.718370 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.718384 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:08Z","lastTransitionTime":"2026-03-20T16:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.820934 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.820976 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.820985 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.821000 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.821010 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:08Z","lastTransitionTime":"2026-03-20T16:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.923518 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.923562 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.923577 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.923596 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:08 crc kubenswrapper[4708]: I0320 16:02:08.923619 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:08Z","lastTransitionTime":"2026-03-20T16:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.026580 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.027288 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.027315 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.027349 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.027374 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:09Z","lastTransitionTime":"2026-03-20T16:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.131010 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.131066 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.131083 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.131107 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.131124 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:09Z","lastTransitionTime":"2026-03-20T16:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.233227 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.233300 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.233320 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.233348 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.233367 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:09Z","lastTransitionTime":"2026-03-20T16:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.337033 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.337204 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.337248 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.337280 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.337304 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:09Z","lastTransitionTime":"2026-03-20T16:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.439862 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.439915 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.439927 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.439946 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.439959 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:09Z","lastTransitionTime":"2026-03-20T16:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.542768 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.542827 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.542842 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.542864 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.542878 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:09Z","lastTransitionTime":"2026-03-20T16:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.647245 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.647311 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.647328 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.647352 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.647369 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:09Z","lastTransitionTime":"2026-03-20T16:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.750461 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.750505 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.750539 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.750555 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.750564 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:09Z","lastTransitionTime":"2026-03-20T16:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.853425 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.853492 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.853506 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.853528 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.853542 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:09Z","lastTransitionTime":"2026-03-20T16:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.862992 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.863095 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.863132 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863209 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:02:13.863175753 +0000 UTC m=+88.537512618 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863276 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.863291 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863304 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863374 4708 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863381 4708 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.863343 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863442 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:13.86342021 +0000 UTC m=+88.537757125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863493 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:13.863454001 +0000 UTC m=+88.537790846 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863510 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863309 4708 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863536 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863555 4708 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863585 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:13.863564544 +0000 UTC m=+88.537901479 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:09 crc kubenswrapper[4708]: E0320 16:02:09.863620 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:13.863603345 +0000 UTC m=+88.537940170 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.957145 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.957205 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.957223 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.957248 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:09 crc kubenswrapper[4708]: I0320 16:02:09.957267 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:09Z","lastTransitionTime":"2026-03-20T16:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.060552 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.060609 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.060651 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.060698 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.060714 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:10Z","lastTransitionTime":"2026-03-20T16:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.110288 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.110322 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:10 crc kubenswrapper[4708]: E0320 16:02:10.110444 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.110489 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:10 crc kubenswrapper[4708]: E0320 16:02:10.110708 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:10 crc kubenswrapper[4708]: E0320 16:02:10.110856 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.156184 4708 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.163524 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.163596 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.163614 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.163643 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.163663 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:10Z","lastTransitionTime":"2026-03-20T16:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.268463 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.268549 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.268574 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.268605 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.268625 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:10Z","lastTransitionTime":"2026-03-20T16:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.372250 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.372327 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.372345 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.372375 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.372392 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:10Z","lastTransitionTime":"2026-03-20T16:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.476006 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.476081 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.476103 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.476129 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.476148 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:10Z","lastTransitionTime":"2026-03-20T16:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.579557 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.579628 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.579653 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.579727 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.579763 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:10Z","lastTransitionTime":"2026-03-20T16:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.682722 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.682783 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.682793 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.682815 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.682827 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:10Z","lastTransitionTime":"2026-03-20T16:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.785575 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.785629 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.785642 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.785664 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.785719 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:10Z","lastTransitionTime":"2026-03-20T16:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.888302 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.888352 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.888366 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.888385 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.888401 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:10Z","lastTransitionTime":"2026-03-20T16:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.991503 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.991555 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.991570 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.991589 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:10 crc kubenswrapper[4708]: I0320 16:02:10.991601 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:10Z","lastTransitionTime":"2026-03-20T16:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.095303 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.095358 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.095367 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.095388 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.095402 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:11Z","lastTransitionTime":"2026-03-20T16:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.198189 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.198259 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.198279 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.198298 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.198312 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:11Z","lastTransitionTime":"2026-03-20T16:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.301126 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.301217 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.301239 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.301272 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.301293 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:11Z","lastTransitionTime":"2026-03-20T16:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.404690 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.404740 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.404750 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.404779 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.404817 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:11Z","lastTransitionTime":"2026-03-20T16:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.508071 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.508122 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.508135 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.508156 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.508172 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:11Z","lastTransitionTime":"2026-03-20T16:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.611216 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.611259 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.611268 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.611284 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.611293 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:11Z","lastTransitionTime":"2026-03-20T16:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.714337 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.714389 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.714403 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.714420 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.714431 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:11Z","lastTransitionTime":"2026-03-20T16:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.817224 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.817270 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.817282 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.817317 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.817329 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:11Z","lastTransitionTime":"2026-03-20T16:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.919995 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.920055 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.920067 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.920088 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:11 crc kubenswrapper[4708]: I0320 16:02:11.920102 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:11Z","lastTransitionTime":"2026-03-20T16:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.023707 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.023875 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.023897 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.023921 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.023960 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:12Z","lastTransitionTime":"2026-03-20T16:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.110472 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.110480 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.110571 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:12 crc kubenswrapper[4708]: E0320 16:02:12.111431 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:12 crc kubenswrapper[4708]: E0320 16:02:12.111498 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:12 crc kubenswrapper[4708]: E0320 16:02:12.111525 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.126882 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.126932 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.126943 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.126964 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.126975 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:12Z","lastTransitionTime":"2026-03-20T16:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.229327 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.229373 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.229385 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.229405 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.229421 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:12Z","lastTransitionTime":"2026-03-20T16:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.331528 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.331569 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.331581 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.331598 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.331614 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:12Z","lastTransitionTime":"2026-03-20T16:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.434061 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.434105 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.434115 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.434132 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.434143 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:12Z","lastTransitionTime":"2026-03-20T16:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.537238 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.537324 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.537354 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.537395 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.537460 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:12Z","lastTransitionTime":"2026-03-20T16:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.640605 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.640644 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.640652 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.640687 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.640698 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:12Z","lastTransitionTime":"2026-03-20T16:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.744032 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.744095 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.744116 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.744144 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.744162 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:12Z","lastTransitionTime":"2026-03-20T16:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.846987 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.847033 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.847043 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.847060 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.847071 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:12Z","lastTransitionTime":"2026-03-20T16:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.949858 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.949913 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.949933 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.949956 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:12 crc kubenswrapper[4708]: I0320 16:02:12.949972 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:12Z","lastTransitionTime":"2026-03-20T16:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.053791 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.053867 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.053886 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.053914 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.053934 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.130306 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.130871 4708 scope.go:117] "RemoveContainer" containerID="8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670" Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.131226 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.158194 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.158253 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.158275 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.158300 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.158319 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.261808 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.261862 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.261874 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.261895 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.261907 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.364772 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.364823 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.364835 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.364855 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.364867 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.468176 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.468296 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.468311 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.468334 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.468349 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.491691 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.491747 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.491764 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.491787 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.491799 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.503462 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.508510 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.508558 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.508569 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.508587 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.508599 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.519932 4708 scope.go:117] "RemoveContainer" containerID="8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670" Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.520103 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.521508 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.526498 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.526589 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.526607 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.526657 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.526722 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.538895 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.543530 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.543619 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.543653 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.543698 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.543718 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.556544 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.567321 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.567422 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.567439 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.567460 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.567473 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.579777 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.580116 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.582281 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.582357 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.582371 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.582393 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.582425 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.686206 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.686262 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.686275 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.686296 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.686312 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.789820 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.789903 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.789977 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.790029 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.790053 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.893906 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.893998 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.894030 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.894050 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.894062 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.902322 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.902450 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.902494 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.902524 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.902554 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.902732 4708 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.902792 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:02:21.902722792 +0000 UTC m=+96.577059577 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.902803 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.902884 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:21.902862256 +0000 UTC m=+96.577198971 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.902917 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.902813 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.902957 4708 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.902966 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.902990 4708 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.903079 4708 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:21.903054281 +0000 UTC m=+96.577391216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.903126 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:21.903105452 +0000 UTC m=+96.577442397 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.903392 4708 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:13 crc kubenswrapper[4708]: E0320 16:02:13.903463 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 16:02:21.903445871 +0000 UTC m=+96.577782586 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.996973 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.997054 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.997090 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.997124 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:13 crc kubenswrapper[4708]: I0320 16:02:13.997148 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:13Z","lastTransitionTime":"2026-03-20T16:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.100503 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.100558 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.100574 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.100601 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.100619 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:14Z","lastTransitionTime":"2026-03-20T16:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.110131 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.110126 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:14 crc kubenswrapper[4708]: E0320 16:02:14.110383 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.110126 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:14 crc kubenswrapper[4708]: E0320 16:02:14.110478 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:14 crc kubenswrapper[4708]: E0320 16:02:14.110625 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.248241 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.248306 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.248319 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.248350 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.248369 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:14Z","lastTransitionTime":"2026-03-20T16:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.351328 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.351367 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.351381 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.351402 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.351416 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:14Z","lastTransitionTime":"2026-03-20T16:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.455041 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.455097 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.455116 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.455143 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.455166 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:14Z","lastTransitionTime":"2026-03-20T16:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.559457 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.559525 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.559547 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.559578 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.559597 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:14Z","lastTransitionTime":"2026-03-20T16:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.663486 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.663548 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.663562 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.663580 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.663592 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:14Z","lastTransitionTime":"2026-03-20T16:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.766606 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.766658 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.766711 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.766744 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.766763 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:14Z","lastTransitionTime":"2026-03-20T16:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.868962 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.869000 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.869014 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.869026 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.869036 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:14Z","lastTransitionTime":"2026-03-20T16:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.972177 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.972240 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.972254 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.972278 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:14 crc kubenswrapper[4708]: I0320 16:02:14.972294 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:14Z","lastTransitionTime":"2026-03-20T16:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.076114 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.076224 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.076246 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.076278 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.076300 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:15Z","lastTransitionTime":"2026-03-20T16:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.179548 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.179632 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.179651 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.179731 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.179760 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:15Z","lastTransitionTime":"2026-03-20T16:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.283389 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.283492 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.283517 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.283554 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.283582 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:15Z","lastTransitionTime":"2026-03-20T16:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.385913 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.386000 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.386019 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.386046 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.386065 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:15Z","lastTransitionTime":"2026-03-20T16:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.488947 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.489052 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.489077 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.489114 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.489194 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:15Z","lastTransitionTime":"2026-03-20T16:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.592043 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.592105 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.592117 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.592141 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.592155 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:15Z","lastTransitionTime":"2026-03-20T16:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.696126 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.696183 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.696201 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.696228 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.696248 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:15Z","lastTransitionTime":"2026-03-20T16:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.800244 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.800323 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.800341 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.800368 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.800386 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:15Z","lastTransitionTime":"2026-03-20T16:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.903802 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.903892 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.903912 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.903952 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:15 crc kubenswrapper[4708]: I0320 16:02:15.903977 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:15Z","lastTransitionTime":"2026-03-20T16:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.006890 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.006959 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.006972 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.006996 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.007008 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:16Z","lastTransitionTime":"2026-03-20T16:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.110076 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.110128 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:16 crc kubenswrapper[4708]: E0320 16:02:16.110322 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.110428 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:16 crc kubenswrapper[4708]: E0320 16:02:16.110508 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.110604 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.110619 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:16 crc kubenswrapper[4708]: E0320 16:02:16.110647 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.110723 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.110757 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.110774 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:16Z","lastTransitionTime":"2026-03-20T16:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.123261 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.130357 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.136556 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.148617 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.167371 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.177705 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.190431 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.202186 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.215151 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:16 crc 
kubenswrapper[4708]: I0320 16:02:16.215240 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.215260 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.215291 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.215310 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:16Z","lastTransitionTime":"2026-03-20T16:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.319121 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.319169 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.319182 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.319204 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.319218 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:16Z","lastTransitionTime":"2026-03-20T16:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.348953 4708 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.422880 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.422953 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.422969 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.422992 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.423006 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:16Z","lastTransitionTime":"2026-03-20T16:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.526419 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.526497 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.526513 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.526536 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.526551 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:16Z","lastTransitionTime":"2026-03-20T16:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.628982 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.629041 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.629053 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.629077 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.629097 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:16Z","lastTransitionTime":"2026-03-20T16:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.731699 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.731745 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.731761 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.731778 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.731788 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:16Z","lastTransitionTime":"2026-03-20T16:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.834284 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.834367 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.834391 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.834422 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.834443 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:16Z","lastTransitionTime":"2026-03-20T16:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.937896 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.937955 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.937971 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.937995 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:16 crc kubenswrapper[4708]: I0320 16:02:16.938010 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:16Z","lastTransitionTime":"2026-03-20T16:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.040447 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.040488 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.040499 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.040516 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.040526 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:17Z","lastTransitionTime":"2026-03-20T16:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.143606 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.143692 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.143707 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.143726 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.143735 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:17Z","lastTransitionTime":"2026-03-20T16:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.247278 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.247346 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.247365 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.247388 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.247405 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:17Z","lastTransitionTime":"2026-03-20T16:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.351923 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.352031 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.352053 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.352115 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.352134 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:17Z","lastTransitionTime":"2026-03-20T16:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.454825 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.454874 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.454889 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.454910 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.454927 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:17Z","lastTransitionTime":"2026-03-20T16:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.558025 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.558066 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.558077 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.558094 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.558103 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:17Z","lastTransitionTime":"2026-03-20T16:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.661534 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.661583 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.661594 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.661613 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.661626 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:17Z","lastTransitionTime":"2026-03-20T16:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.763627 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.763698 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.763718 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.763739 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.763753 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:17Z","lastTransitionTime":"2026-03-20T16:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.867506 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.867581 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.867602 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.867623 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.867635 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:17Z","lastTransitionTime":"2026-03-20T16:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.971086 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.971159 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.971171 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.971189 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:17 crc kubenswrapper[4708]: I0320 16:02:17.971201 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:17Z","lastTransitionTime":"2026-03-20T16:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.073921 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.074012 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.074033 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.074054 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.074102 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:18Z","lastTransitionTime":"2026-03-20T16:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.110495 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:18 crc kubenswrapper[4708]: E0320 16:02:18.110625 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.110495 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.110765 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:18 crc kubenswrapper[4708]: E0320 16:02:18.111115 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:18 crc kubenswrapper[4708]: E0320 16:02:18.111260 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:18 crc kubenswrapper[4708]: E0320 16:02:18.115052 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,Volume
Devices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 16:02:18 crc kubenswrapper[4708]: E0320 16:02:18.118417 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.122841 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.177163 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.177222 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.177239 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.177263 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:18 crc kubenswrapper[4708]: I0320 16:02:18.177280 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:18Z","lastTransitionTime":"2026-03-20T16:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.304754 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:19 crc kubenswrapper[4708]: E0320 16:02:19.304900 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.304983 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:19 crc kubenswrapper[4708]: E0320 16:02:19.305194 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.305823 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.305845 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.305854 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.305868 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.305879 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:19Z","lastTransitionTime":"2026-03-20T16:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.409261 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.409312 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.409323 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.409342 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.409357 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:19Z","lastTransitionTime":"2026-03-20T16:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.512482 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.512537 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.512554 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.512575 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.512588 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:19Z","lastTransitionTime":"2026-03-20T16:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.615020 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.615089 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.615099 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.615119 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.615129 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:19Z","lastTransitionTime":"2026-03-20T16:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.718185 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.718229 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.718241 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.718260 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.718275 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:19Z","lastTransitionTime":"2026-03-20T16:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.821184 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.821239 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.821250 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.821268 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.821286 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:19Z","lastTransitionTime":"2026-03-20T16:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.923754 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.923805 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.923816 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.923834 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:19 crc kubenswrapper[4708]: I0320 16:02:19.923846 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:19Z","lastTransitionTime":"2026-03-20T16:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.027047 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.027117 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.027129 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.027148 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.027164 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:20Z","lastTransitionTime":"2026-03-20T16:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.110723 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:20 crc kubenswrapper[4708]: E0320 16:02:20.110894 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.129737 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.129805 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.129824 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.129863 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.129886 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:20Z","lastTransitionTime":"2026-03-20T16:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.233352 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.233853 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.233948 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.234037 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.234132 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:20Z","lastTransitionTime":"2026-03-20T16:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.336574 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.336624 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.336634 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.336650 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.336663 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:20Z","lastTransitionTime":"2026-03-20T16:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.439950 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.439991 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.440002 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.440019 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.440032 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:20Z","lastTransitionTime":"2026-03-20T16:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.543102 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.543173 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.543192 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.543217 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.543237 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:20Z","lastTransitionTime":"2026-03-20T16:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.646073 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.646117 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.646127 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.646145 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.646155 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:20Z","lastTransitionTime":"2026-03-20T16:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.749235 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.749279 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.749294 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.749321 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.749335 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:20Z","lastTransitionTime":"2026-03-20T16:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.852287 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.852326 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.852336 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.852351 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.852361 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:20Z","lastTransitionTime":"2026-03-20T16:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.954996 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.955077 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.955097 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.955128 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:20 crc kubenswrapper[4708]: I0320 16:02:20.955149 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:20Z","lastTransitionTime":"2026-03-20T16:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.057769 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.058097 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.058183 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.058256 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.058318 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:21Z","lastTransitionTime":"2026-03-20T16:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.110413 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.110519 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.110628 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.110842 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.161234 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.161305 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.161323 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.161351 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.161368 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:21Z","lastTransitionTime":"2026-03-20T16:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.264206 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.264262 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.264276 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.264297 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.264307 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:21Z","lastTransitionTime":"2026-03-20T16:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.366809 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.366900 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.366912 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.366929 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.366941 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:21Z","lastTransitionTime":"2026-03-20T16:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.469387 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.469443 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.469461 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.469487 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.469502 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:21Z","lastTransitionTime":"2026-03-20T16:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.572320 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.572365 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.572374 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.572391 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.572403 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:21Z","lastTransitionTime":"2026-03-20T16:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.675643 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.675732 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.675747 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.675771 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.675786 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:21Z","lastTransitionTime":"2026-03-20T16:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.778932 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.778981 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.778992 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.779010 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.779020 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:21Z","lastTransitionTime":"2026-03-20T16:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.881635 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.881718 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.881737 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.881756 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.881767 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:21Z","lastTransitionTime":"2026-03-20T16:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.927653 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.927751 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.927779 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.927805 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.927870 4708 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not 
registered Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.927933 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:02:37.927868044 +0000 UTC m=+112.602204759 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.927941 4708 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.928012 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:37.927981657 +0000 UTC m=+112.602318372 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.928063 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.927982 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.928089 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.928105 4708 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.928163 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.928227 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.928249 4708 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.928174 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:37.928116291 +0000 UTC m=+112.602453176 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.928339 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:37.928318616 +0000 UTC m=+112.602655481 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:21 crc kubenswrapper[4708]: E0320 16:02:21.928361 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:37.928349898 +0000 UTC m=+112.602686843 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.984512 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.984774 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.984788 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.984811 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:21 crc kubenswrapper[4708]: I0320 16:02:21.984826 4708 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:21Z","lastTransitionTime":"2026-03-20T16:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.088620 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.088694 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.088705 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.088725 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.088738 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:22Z","lastTransitionTime":"2026-03-20T16:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.110367 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:22 crc kubenswrapper[4708]: E0320 16:02:22.110970 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:22 crc kubenswrapper[4708]: E0320 16:02:22.112383 4708 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:02:22 crc kubenswrapper[4708]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 16:02:22 crc kubenswrapper[4708]: set -o allexport Mar 20 16:02:22 crc kubenswrapper[4708]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 16:02:22 crc kubenswrapper[4708]: source /etc/kubernetes/apiserver-url.env Mar 20 16:02:22 crc kubenswrapper[4708]: else Mar 20 16:02:22 crc kubenswrapper[4708]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 16:02:22 crc kubenswrapper[4708]: exit 1 Mar 20 16:02:22 crc kubenswrapper[4708]: fi Mar 20 16:02:22 crc kubenswrapper[4708]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 16:02:22 crc kubenswrapper[4708]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 16:02:22 crc kubenswrapper[4708]: > logger="UnhandledError" Mar 20 16:02:22 crc kubenswrapper[4708]: E0320 16:02:22.112505 4708 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:02:22 crc kubenswrapper[4708]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 16:02:22 crc kubenswrapper[4708]: if [[ -f "/env/_master" ]]; then Mar 20 16:02:22 crc kubenswrapper[4708]: set -o allexport Mar 20 16:02:22 crc kubenswrapper[4708]: source "/env/_master" Mar 20 16:02:22 crc kubenswrapper[4708]: set +o allexport Mar 20 16:02:22 crc 
kubenswrapper[4708]: fi Mar 20 16:02:22 crc kubenswrapper[4708]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 16:02:22 crc kubenswrapper[4708]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 16:02:22 crc kubenswrapper[4708]: ho_enable="--enable-hybrid-overlay" Mar 20 16:02:22 crc kubenswrapper[4708]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 16:02:22 crc kubenswrapper[4708]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 16:02:22 crc kubenswrapper[4708]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 16:02:22 crc kubenswrapper[4708]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 16:02:22 crc kubenswrapper[4708]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 16:02:22 crc kubenswrapper[4708]: --webhook-host=127.0.0.1 \ Mar 20 16:02:22 crc kubenswrapper[4708]: --webhook-port=9743 \ Mar 20 16:02:22 crc kubenswrapper[4708]: ${ho_enable} \ Mar 20 16:02:22 crc kubenswrapper[4708]: --enable-interconnect \ Mar 20 16:02:22 crc kubenswrapper[4708]: --disable-approver \ Mar 20 16:02:22 crc kubenswrapper[4708]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 16:02:22 crc kubenswrapper[4708]: --wait-for-kubernetes-api=200s \ Mar 20 16:02:22 crc kubenswrapper[4708]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 16:02:22 crc kubenswrapper[4708]: --loglevel="${LOGLEVEL}" Mar 20 16:02:22 crc kubenswrapper[4708]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 16:02:22 crc 
kubenswrapper[4708]: > logger="UnhandledError" Mar 20 16:02:22 crc kubenswrapper[4708]: E0320 16:02:22.113906 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 16:02:22 crc kubenswrapper[4708]: E0320 16:02:22.115175 4708 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:02:22 crc kubenswrapper[4708]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 16:02:22 crc kubenswrapper[4708]: if [[ -f "/env/_master" ]]; then Mar 20 16:02:22 crc kubenswrapper[4708]: set -o allexport Mar 20 16:02:22 crc kubenswrapper[4708]: source "/env/_master" Mar 20 16:02:22 crc kubenswrapper[4708]: set +o allexport Mar 20 16:02:22 crc kubenswrapper[4708]: fi Mar 20 16:02:22 crc kubenswrapper[4708]: Mar 20 16:02:22 crc kubenswrapper[4708]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 16:02:22 crc kubenswrapper[4708]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 16:02:22 crc kubenswrapper[4708]: --disable-webhook \ Mar 20 16:02:22 crc kubenswrapper[4708]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 16:02:22 crc kubenswrapper[4708]: --loglevel="${LOGLEVEL}" Mar 20 16:02:22 crc kubenswrapper[4708]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 16:02:22 crc kubenswrapper[4708]: > logger="UnhandledError" Mar 20 16:02:22 crc kubenswrapper[4708]: E0320 16:02:22.116401 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.191113 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.191225 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.191255 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.191293 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.191322 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:22Z","lastTransitionTime":"2026-03-20T16:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.294225 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.294279 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.294292 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.294316 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.294327 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:22Z","lastTransitionTime":"2026-03-20T16:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.397437 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.397556 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.397580 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.397614 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.397636 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:22Z","lastTransitionTime":"2026-03-20T16:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.500213 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.500282 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.500300 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.500329 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.500347 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:22Z","lastTransitionTime":"2026-03-20T16:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.532640 4708 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.603944 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.604027 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.604049 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.604079 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.604099 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:22Z","lastTransitionTime":"2026-03-20T16:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.707890 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.707960 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.707975 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.707997 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.708014 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:22Z","lastTransitionTime":"2026-03-20T16:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.810621 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.810732 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.810758 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.810786 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.810808 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:22Z","lastTransitionTime":"2026-03-20T16:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.913909 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.913975 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.913985 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.914011 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:22 crc kubenswrapper[4708]: I0320 16:02:22.914024 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:22Z","lastTransitionTime":"2026-03-20T16:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.017623 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.017702 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.017719 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.017745 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.017762 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.110504 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.110607 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:23 crc kubenswrapper[4708]: E0320 16:02:23.110724 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:23 crc kubenswrapper[4708]: E0320 16:02:23.110842 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.121935 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.121978 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.121991 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.122009 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.122023 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.225495 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.225566 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.225587 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.225616 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.225639 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.328356 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.328460 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.328487 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.328523 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.328581 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.431776 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.431845 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.431866 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.431902 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.431916 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.534204 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.534249 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.534260 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.534277 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.534290 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.636833 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.636886 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.636901 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.636920 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.636936 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.739840 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.739921 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.739941 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.739978 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.740002 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.843303 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.843382 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.843408 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.843442 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.843466 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.940998 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.941062 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.941078 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.941109 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.941131 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: E0320 16:02:23.959188 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.963948 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.963996 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.964008 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.964030 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.964042 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: E0320 16:02:23.975540 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.979878 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.979912 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.979922 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.979937 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.979947 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:23 crc kubenswrapper[4708]: E0320 16:02:23.992080 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.996740 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.996769 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.996781 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.996795 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:23 crc kubenswrapper[4708]: I0320 16:02:23.996803 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:23Z","lastTransitionTime":"2026-03-20T16:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:24 crc kubenswrapper[4708]: E0320 16:02:24.053396 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.060027 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.060066 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.060078 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.060096 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.060110 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:24Z","lastTransitionTime":"2026-03-20T16:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:24 crc kubenswrapper[4708]: E0320 16:02:24.077275 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:24 crc kubenswrapper[4708]: E0320 16:02:24.077436 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.079322 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.079348 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.079359 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.079376 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.079392 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:24Z","lastTransitionTime":"2026-03-20T16:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.110176 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:24 crc kubenswrapper[4708]: E0320 16:02:24.110304 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.181964 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.182021 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.182031 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.182049 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.182061 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:24Z","lastTransitionTime":"2026-03-20T16:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.292367 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.292432 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.292445 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.292472 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.292489 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:24Z","lastTransitionTime":"2026-03-20T16:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.395831 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.395892 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.395910 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.395935 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.395956 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:24Z","lastTransitionTime":"2026-03-20T16:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.499080 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.499128 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.499142 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.499161 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.499175 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:24Z","lastTransitionTime":"2026-03-20T16:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.601741 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.601824 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.601838 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.601884 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.601895 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:24Z","lastTransitionTime":"2026-03-20T16:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.704786 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.704832 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.704846 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.704867 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.704879 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:24Z","lastTransitionTime":"2026-03-20T16:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.807785 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.807849 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.807864 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.807887 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.807901 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:24Z","lastTransitionTime":"2026-03-20T16:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.911413 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.911482 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.911497 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.911525 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:24 crc kubenswrapper[4708]: I0320 16:02:24.911542 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:24Z","lastTransitionTime":"2026-03-20T16:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.014855 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.015219 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.015294 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.015380 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.015451 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:25Z","lastTransitionTime":"2026-03-20T16:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.110930 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.110978 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:25 crc kubenswrapper[4708]: E0320 16:02:25.111184 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:25 crc kubenswrapper[4708]: E0320 16:02:25.111324 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.119150 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.119476 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.119543 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.119607 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.119869 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:25Z","lastTransitionTime":"2026-03-20T16:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.222622 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.222693 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.222708 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.222732 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.222747 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:25Z","lastTransitionTime":"2026-03-20T16:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.325817 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.326172 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.326261 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.326345 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.326404 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:25Z","lastTransitionTime":"2026-03-20T16:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.429813 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.430137 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.430243 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.430344 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.430539 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:25Z","lastTransitionTime":"2026-03-20T16:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.489993 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4jhr8"] Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.490530 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-4jhr8" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.493512 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.494213 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.494798 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.512747 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.527138 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.534020 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:25 crc 
kubenswrapper[4708]: I0320 16:02:25.534071 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.534086 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.534106 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.534119 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:25Z","lastTransitionTime":"2026-03-20T16:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.537217 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.562998 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83526619-9c43-409d-af71-0b8ebbe71231-hosts-file\") pod \"node-resolver-4jhr8\" (UID: \"83526619-9c43-409d-af71-0b8ebbe71231\") " pod="openshift-dns/node-resolver-4jhr8" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.563069 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x788b\" (UniqueName: \"kubernetes.io/projected/83526619-9c43-409d-af71-0b8ebbe71231-kube-api-access-x788b\") pod \"node-resolver-4jhr8\" (UID: \"83526619-9c43-409d-af71-0b8ebbe71231\") " pod="openshift-dns/node-resolver-4jhr8" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.567368 4708 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa1264
6b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.582893 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.596425 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.609524 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.620182 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.630465 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.637121 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.637409 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.637507 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.637615 4708 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.637705 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:25Z","lastTransitionTime":"2026-03-20T16:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.640650 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.663909 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x788b\" (UniqueName: \"kubernetes.io/projected/83526619-9c43-409d-af71-0b8ebbe71231-kube-api-access-x788b\") pod \"node-resolver-4jhr8\" (UID: \"83526619-9c43-409d-af71-0b8ebbe71231\") " pod="openshift-dns/node-resolver-4jhr8" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.663989 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83526619-9c43-409d-af71-0b8ebbe71231-hosts-file\") pod \"node-resolver-4jhr8\" (UID: 
\"83526619-9c43-409d-af71-0b8ebbe71231\") " pod="openshift-dns/node-resolver-4jhr8" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.664074 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83526619-9c43-409d-af71-0b8ebbe71231-hosts-file\") pod \"node-resolver-4jhr8\" (UID: \"83526619-9c43-409d-af71-0b8ebbe71231\") " pod="openshift-dns/node-resolver-4jhr8" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.684311 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x788b\" (UniqueName: \"kubernetes.io/projected/83526619-9c43-409d-af71-0b8ebbe71231-kube-api-access-x788b\") pod \"node-resolver-4jhr8\" (UID: \"83526619-9c43-409d-af71-0b8ebbe71231\") " pod="openshift-dns/node-resolver-4jhr8" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.740190 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.740252 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.740269 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.740294 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.740310 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:25Z","lastTransitionTime":"2026-03-20T16:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.807982 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4jhr8" Mar 20 16:02:25 crc kubenswrapper[4708]: W0320 16:02:25.822903 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83526619_9c43_409d_af71_0b8ebbe71231.slice/crio-c173d73f3c2b4c900fac0ba3efcdc10b322ca7f2d49b34805b300761b29255bf WatchSource:0}: Error finding container c173d73f3c2b4c900fac0ba3efcdc10b322ca7f2d49b34805b300761b29255bf: Status 404 returned error can't find the container with id c173d73f3c2b4c900fac0ba3efcdc10b322ca7f2d49b34805b300761b29255bf Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.843955 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8kspl"] Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.844322 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.845645 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-m98sv"] Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.846040 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-sgbv9"] Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.846243 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.846524 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.848356 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.848562 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.848700 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.848901 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.849005 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.849167 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.849293 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.849503 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.849518 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.850406 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.850536 4708 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.850579 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.850594 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.850618 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.850638 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:25Z","lastTransitionTime":"2026-03-20T16:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.851248 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.851468 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865501 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-os-release\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865553 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-cnibin\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865622 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbd987d1-f981-4e7a-b063-920f84a0d7f6-proxy-tls\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865721 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-run-multus-certs\") pod \"multus-8kspl\" (UID: 
\"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865754 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-multus-socket-dir-parent\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865784 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-cnibin\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865807 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-run-k8s-cni-cncf-io\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865840 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fbd987d1-f981-4e7a-b063-920f84a0d7f6-rootfs\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865872 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-system-cni-dir\") pod \"multus-additional-cni-plugins-m98sv\" (UID: 
\"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865898 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtslg\" (UniqueName: \"kubernetes.io/projected/fbd987d1-f981-4e7a-b063-920f84a0d7f6-kube-api-access-rtslg\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865922 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-var-lib-cni-multus\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865945 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-hostroot\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865971 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfdz5\" (UniqueName: \"kubernetes.io/projected/b9ddc889-df44-41f4-bb84-bb103bf9695a-kube-api-access-cfdz5\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.865995 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-os-release\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866024 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh9ff\" (UniqueName: \"kubernetes.io/projected/f49a68df-98d0-464f-b40e-0aba2faab528-kube-api-access-nh9ff\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866124 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-multus-cni-dir\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866188 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-var-lib-cni-bin\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866209 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f49a68df-98d0-464f-b40e-0aba2faab528-multus-daemon-config\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866233 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/b9ddc889-df44-41f4-bb84-bb103bf9695a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866299 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbd987d1-f981-4e7a-b063-920f84a0d7f6-mcd-auth-proxy-config\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866360 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-system-cni-dir\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866411 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9ddc889-df44-41f4-bb84-bb103bf9695a-cni-binary-copy\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866437 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f49a68df-98d0-464f-b40e-0aba2faab528-cni-binary-copy\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866456 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-run-netns\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866496 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-var-lib-kubelet\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866542 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-etc-kubernetes\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866578 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-multus-conf-dir\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.866606 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.869821 4708 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1df
f3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd
9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.884070 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.895057 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.911191 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.926921 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.938265 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.946421 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 
16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.952957 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.953004 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.953015 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.953033 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.953044 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:25Z","lastTransitionTime":"2026-03-20T16:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.958510 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967582 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fbd987d1-f981-4e7a-b063-920f84a0d7f6-rootfs\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967634 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-cnibin\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967656 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-run-k8s-cni-cncf-io\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc 
kubenswrapper[4708]: I0320 16:02:25.967700 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-system-cni-dir\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967723 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfdz5\" (UniqueName: \"kubernetes.io/projected/b9ddc889-df44-41f4-bb84-bb103bf9695a-kube-api-access-cfdz5\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967729 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fbd987d1-f981-4e7a-b063-920f84a0d7f6-rootfs\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967765 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-cnibin\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967746 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtslg\" (UniqueName: \"kubernetes.io/projected/fbd987d1-f981-4e7a-b063-920f84a0d7f6-kube-api-access-rtslg\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc 
kubenswrapper[4708]: I0320 16:02:25.967811 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-run-k8s-cni-cncf-io\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967835 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-var-lib-cni-multus\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967860 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-hostroot\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967883 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh9ff\" (UniqueName: \"kubernetes.io/projected/f49a68df-98d0-464f-b40e-0aba2faab528-kube-api-access-nh9ff\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967888 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-hostroot\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967898 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-os-release\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967939 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-multus-cni-dir\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967971 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9ddc889-df44-41f4-bb84-bb103bf9695a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967995 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-var-lib-cni-bin\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967792 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-system-cni-dir\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968016 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/f49a68df-98d0-464f-b40e-0aba2faab528-multus-daemon-config\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968049 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9ddc889-df44-41f4-bb84-bb103bf9695a-cni-binary-copy\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968070 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbd987d1-f981-4e7a-b063-920f84a0d7f6-mcd-auth-proxy-config\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968087 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-system-cni-dir\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968108 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-run-netns\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968129 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/f49a68df-98d0-464f-b40e-0aba2faab528-cni-binary-copy\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968150 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-var-lib-kubelet\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968170 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-etc-kubernetes\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968202 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968221 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-multus-conf-dir\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968258 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-os-release\") pod \"multus-additional-cni-plugins-m98sv\" 
(UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968288 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-cnibin\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968308 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbd987d1-f981-4e7a-b063-920f84a0d7f6-proxy-tls\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968328 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-run-multus-certs\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968349 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-multus-socket-dir-parent\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968419 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-multus-socket-dir-parent\") pod \"multus-8kspl\" (UID: 
\"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968459 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-system-cni-dir\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968489 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-run-netns\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968873 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9ddc889-df44-41f4-bb84-bb103bf9695a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968884 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f49a68df-98d0-464f-b40e-0aba2faab528-multus-daemon-config\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968904 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbd987d1-f981-4e7a-b063-920f84a0d7f6-mcd-auth-proxy-config\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 
16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967862 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-var-lib-cni-multus\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968932 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-var-lib-cni-bin\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.968979 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-multus-conf-dir\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.969002 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-var-lib-kubelet\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.969021 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-etc-kubernetes\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.969072 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/f49a68df-98d0-464f-b40e-0aba2faab528-cni-binary-copy\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.969084 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-multus-cni-dir\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.967944 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-os-release\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.969138 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-os-release\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.969172 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f49a68df-98d0-464f-b40e-0aba2faab528-host-run-multus-certs\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.969204 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-cnibin\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " 
pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.969464 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9ddc889-df44-41f4-bb84-bb103bf9695a-cni-binary-copy\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.970555 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9ddc889-df44-41f4-bb84-bb103bf9695a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.971868 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.973642 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbd987d1-f981-4e7a-b063-920f84a0d7f6-proxy-tls\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.984966 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfdz5\" (UniqueName: 
\"kubernetes.io/projected/b9ddc889-df44-41f4-bb84-bb103bf9695a-kube-api-access-cfdz5\") pod \"multus-additional-cni-plugins-m98sv\" (UID: \"b9ddc889-df44-41f4-bb84-bb103bf9695a\") " pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.985377 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh9ff\" (UniqueName: \"kubernetes.io/projected/f49a68df-98d0-464f-b40e-0aba2faab528-kube-api-access-nh9ff\") pod \"multus-8kspl\" (UID: \"f49a68df-98d0-464f-b40e-0aba2faab528\") " pod="openshift-multus/multus-8kspl" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.985459 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.988579 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtslg\" (UniqueName: \"kubernetes.io/projected/fbd987d1-f981-4e7a-b063-920f84a0d7f6-kube-api-access-rtslg\") pod \"machine-config-daemon-sgbv9\" (UID: \"fbd987d1-f981-4e7a-b063-920f84a0d7f6\") " pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:25 crc kubenswrapper[4708]: I0320 16:02:25.997834 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.006765 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.019706 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0a
d5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.029978 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.041008 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.051977 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.056285 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:26 crc 
kubenswrapper[4708]: I0320 16:02:26.056318 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.056330 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.056348 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.056360 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:26Z","lastTransitionTime":"2026-03-20T16:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.064255 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.082218 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.092546 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.105307 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.113989 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:26 crc kubenswrapper[4708]: E0320 16:02:26.114174 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.118647 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.128830 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.139306 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.153538 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.159048 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.159099 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.159119 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.159145 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.159163 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:26Z","lastTransitionTime":"2026-03-20T16:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.162072 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.170124 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8kspl" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.177433 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.179296 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: W0320 16:02:26.182234 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49a68df_98d0_464f_b40e_0aba2faab528.slice/crio-b6d694928aa56e1458693774a00c5a1569f7f38dd50c7009993585627cf265b1 WatchSource:0}: Error finding container b6d694928aa56e1458693774a00c5a1569f7f38dd50c7009993585627cf265b1: Status 404 returned error can't find the container with id b6d694928aa56e1458693774a00c5a1569f7f38dd50c7009993585627cf265b1 Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.184048 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m98sv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.191269 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: W0320 16:02:26.193822 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd987d1_f981_4e7a_b063_920f84a0d7f6.slice/crio-2134e424749b0cee33215100fc95b6daf80b4c9a9baf34856b979df9ee5e40cb WatchSource:0}: Error finding container 2134e424749b0cee33215100fc95b6daf80b4c9a9baf34856b979df9ee5e40cb: Status 404 returned error can't find the container with id 2134e424749b0cee33215100fc95b6daf80b4c9a9baf34856b979df9ee5e40cb Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.203812 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: W0320 16:02:26.214264 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9ddc889_df44_41f4_bb84_bb103bf9695a.slice/crio-a18238408f1e38bd74881985c5a0f813a6b94758a3c3e87cb095277b01872b90 WatchSource:0}: Error finding container a18238408f1e38bd74881985c5a0f813a6b94758a3c3e87cb095277b01872b90: Status 404 returned error can't find the container with id a18238408f1e38bd74881985c5a0f813a6b94758a3c3e87cb095277b01872b90 Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.217255 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rcmhv"] Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.218225 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.220294 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.220809 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.221613 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.221761 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.221806 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.221835 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.223943 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.226539 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 16:02:26 crc 
kubenswrapper[4708]: I0320 16:02:26.243339 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.261635 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.262764 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.262808 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.262820 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.262837 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.262846 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:26Z","lastTransitionTime":"2026-03-20T16:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.271627 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovn-node-metrics-cert\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.271811 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-bin\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.271909 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-systemd-units\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.272030 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-ovn\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.272135 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-script-lib\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.272259 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-openvswitch\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.272369 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-config\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.272498 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fwwd\" (UniqueName: \"kubernetes.io/projected/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-kube-api-access-8fwwd\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.272600 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-netns\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.272705 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-etc-openvswitch\") pod \"ovnkube-node-rcmhv\" (UID: 
\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.273047 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.273172 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-log-socket\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.273299 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-netd\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.273409 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-env-overrides\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.273543 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-node-log\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.273744 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-slash\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.273854 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-var-lib-openvswitch\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.273966 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-kubelet\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.274048 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-systemd\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.274250 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-ovn-kubernetes\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.275602 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.300404 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.312859 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.326642 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.329646 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8kspl" event={"ID":"f49a68df-98d0-464f-b40e-0aba2faab528","Type":"ContainerStarted","Data":"527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.329708 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8kspl" event={"ID":"f49a68df-98d0-464f-b40e-0aba2faab528","Type":"ContainerStarted","Data":"b6d694928aa56e1458693774a00c5a1569f7f38dd50c7009993585627cf265b1"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.331981 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" event={"ID":"b9ddc889-df44-41f4-bb84-bb103bf9695a","Type":"ContainerStarted","Data":"a18238408f1e38bd74881985c5a0f813a6b94758a3c3e87cb095277b01872b90"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.334018 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4jhr8" 
event={"ID":"83526619-9c43-409d-af71-0b8ebbe71231","Type":"ContainerStarted","Data":"ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.334048 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4jhr8" event={"ID":"83526619-9c43-409d-af71-0b8ebbe71231","Type":"ContainerStarted","Data":"c173d73f3c2b4c900fac0ba3efcdc10b322ca7f2d49b34805b300761b29255bf"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.336166 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerStarted","Data":"d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.336193 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerStarted","Data":"2134e424749b0cee33215100fc95b6daf80b4c9a9baf34856b979df9ee5e40cb"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.338181 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.357740 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.366632 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.366780 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.366797 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.366819 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.366832 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:26Z","lastTransitionTime":"2026-03-20T16:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.369963 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375508 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-slash\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375588 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-var-lib-openvswitch\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375622 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-ovn-kubernetes\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375644 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-kubelet\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375661 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-systemd\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375728 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovn-node-metrics-cert\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375754 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-systemd-units\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375775 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-ovn\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375797 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-bin\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375820 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-script-lib\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375868 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-openvswitch\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375893 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-config\") pod 
\"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375923 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fwwd\" (UniqueName: \"kubernetes.io/projected/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-kube-api-access-8fwwd\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375956 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-etc-openvswitch\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.375991 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-netns\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.376013 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-log-socket\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.376030 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-netd\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.376053 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.376072 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-env-overrides\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.376090 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-node-log\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.376156 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-node-log\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.376197 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-slash\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc 
kubenswrapper[4708]: I0320 16:02:26.377848 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-netns\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.377907 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-openvswitch\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.379933 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-netd\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.377993 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-kubelet\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.380170 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-ovn\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.380195 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-etc-openvswitch\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.380146 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-systemd\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.380258 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.380282 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-bin\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.380529 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-log-socket\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.377968 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-systemd-units\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.380651 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-var-lib-openvswitch\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.377985 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-ovn-kubernetes\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.380899 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-config\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.382079 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-env-overrides\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.382260 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-script-lib\") pod 
\"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.383447 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.394536 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.397146 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovn-node-metrics-cert\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.397512 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fwwd\" (UniqueName: \"kubernetes.io/projected/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-kube-api-access-8fwwd\") pod \"ovnkube-node-rcmhv\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.407443 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.421157 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.432947 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.457176 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.469802 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.469885 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.469899 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.469927 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.469940 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:26Z","lastTransitionTime":"2026-03-20T16:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.470160 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.481938 
4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.492292 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.502802 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.511541 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.521107 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.533450 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.539106 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.571996 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.572036 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.572047 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.572066 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.572077 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:26Z","lastTransitionTime":"2026-03-20T16:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.674627 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.674714 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.674734 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.674761 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.674804 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:26Z","lastTransitionTime":"2026-03-20T16:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.779052 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.779101 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.779113 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.779134 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.779148 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:26Z","lastTransitionTime":"2026-03-20T16:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.882398 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.882489 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.882506 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.882571 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.882591 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:26Z","lastTransitionTime":"2026-03-20T16:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.984922 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.984972 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.984983 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.985002 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:26 crc kubenswrapper[4708]: I0320 16:02:26.985014 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:26Z","lastTransitionTime":"2026-03-20T16:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.087722 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.087792 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.087802 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.088019 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.088029 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:27Z","lastTransitionTime":"2026-03-20T16:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.110070 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.110208 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:27 crc kubenswrapper[4708]: E0320 16:02:27.110382 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:27 crc kubenswrapper[4708]: E0320 16:02:27.110931 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.111364 4708 scope.go:117] "RemoveContainer" containerID="8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670" Mar 20 16:02:27 crc kubenswrapper[4708]: E0320 16:02:27.111836 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.191320 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.191366 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.191380 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.191398 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.191412 4708 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:27Z","lastTransitionTime":"2026-03-20T16:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.300965 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.301439 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.301452 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.301474 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.301484 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:27Z","lastTransitionTime":"2026-03-20T16:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.340504 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2" exitCode=0 Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.340599 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.340653 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"dda90ce3aeec8c31d026f5fe2271d3ced123bdc26d8194deacf44f8f8cf9513d"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.344625 4708 generic.go:334] "Generic (PLEG): container finished" podID="b9ddc889-df44-41f4-bb84-bb103bf9695a" containerID="10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de" exitCode=0 Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.344738 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" event={"ID":"b9ddc889-df44-41f4-bb84-bb103bf9695a","Type":"ContainerDied","Data":"10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.351318 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerStarted","Data":"b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.352654 4708 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.366048 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.375869 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.386540 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.404073 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.404110 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.404119 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.404133 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.404144 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:27Z","lastTransitionTime":"2026-03-20T16:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.405092 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.417746 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.426292 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.435379 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.444928 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.456764 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.468228 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.486589 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.497114 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.508592 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.508730 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.508779 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.508806 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.508823 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:27Z","lastTransitionTime":"2026-03-20T16:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.513062 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.523384 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4
ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.533999 4708 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e
6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db3
95950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.547273 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.557607 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.570962 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.584497 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.602348 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.611769 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.611891 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.611988 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.611795 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.612061 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.612418 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:27Z","lastTransitionTime":"2026-03-20T16:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.626169 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.636552 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.648642 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.656281 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.663699 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 
16:02:27.679580 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.715203 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.715288 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.715315 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.715356 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.715383 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:27Z","lastTransitionTime":"2026-03-20T16:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.818801 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.819230 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.819245 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.819269 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.819288 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:27Z","lastTransitionTime":"2026-03-20T16:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.922305 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.922345 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.922357 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.922373 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:27 crc kubenswrapper[4708]: I0320 16:02:27.922386 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:27Z","lastTransitionTime":"2026-03-20T16:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.025765 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.025823 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.025840 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.025862 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.025878 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:28Z","lastTransitionTime":"2026-03-20T16:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.110626 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:28 crc kubenswrapper[4708]: E0320 16:02:28.110762 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.127438 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.127738 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.127828 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.127910 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.127974 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:28Z","lastTransitionTime":"2026-03-20T16:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.230823 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.230983 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.230994 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.231028 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.231037 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:28Z","lastTransitionTime":"2026-03-20T16:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.333720 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.333821 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.333848 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.333883 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.333911 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:28Z","lastTransitionTime":"2026-03-20T16:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.358977 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.359035 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.359048 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.359060 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.359070 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.362557 4708 generic.go:334] "Generic (PLEG): container finished" podID="b9ddc889-df44-41f4-bb84-bb103bf9695a" containerID="d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7" exitCode=0 Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.362696 4708 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" event={"ID":"b9ddc889-df44-41f4-bb84-bb103bf9695a","Type":"ContainerDied","Data":"d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.374828 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.388391 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.399892 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.412609 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 
16:02:28.427895 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.437172 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.437220 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.437235 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.437271 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.437288 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:28Z","lastTransitionTime":"2026-03-20T16:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.438661 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.453992 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.467434 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.477476 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.486648 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.497183 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.518077 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.529743 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.540241 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.540277 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.540285 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.540301 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.540310 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:28Z","lastTransitionTime":"2026-03-20T16:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.541889 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.643595 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.643628 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.643640 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.643656 
4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.643684 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:28Z","lastTransitionTime":"2026-03-20T16:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.746035 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.746075 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.746085 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.746104 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.746115 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:28Z","lastTransitionTime":"2026-03-20T16:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.848379 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.848414 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.848443 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.848459 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.848469 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:28Z","lastTransitionTime":"2026-03-20T16:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.951080 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.951106 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.951116 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.951128 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:28 crc kubenswrapper[4708]: I0320 16:02:28.951137 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:28Z","lastTransitionTime":"2026-03-20T16:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.053609 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.053745 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.053772 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.053802 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.053824 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:29Z","lastTransitionTime":"2026-03-20T16:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.110041 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:29 crc kubenswrapper[4708]: E0320 16:02:29.110258 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.110040 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:29 crc kubenswrapper[4708]: E0320 16:02:29.111112 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.157191 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.157242 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.157254 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.157276 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.157290 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:29Z","lastTransitionTime":"2026-03-20T16:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.260339 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.260385 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.260441 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.260466 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.260887 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:29Z","lastTransitionTime":"2026-03-20T16:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.363788 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.363842 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.363856 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.363875 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.363889 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:29Z","lastTransitionTime":"2026-03-20T16:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.368906 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.371059 4708 generic.go:334] "Generic (PLEG): container finished" podID="b9ddc889-df44-41f4-bb84-bb103bf9695a" containerID="7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d" exitCode=0 Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.371095 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" event={"ID":"b9ddc889-df44-41f4-bb84-bb103bf9695a","Type":"ContainerDied","Data":"7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.389042 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.405977 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.426331 4708 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd
4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.436575 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.444979 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.460779 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 
16:02:29.465997 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.466035 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.466045 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.466079 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.466090 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:29Z","lastTransitionTime":"2026-03-20T16:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.490576 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.501361 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.513519 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.527341 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.537878 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.549101 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.561277 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0a
d5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.568857 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.568895 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.568909 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.568927 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.568940 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:29Z","lastTransitionTime":"2026-03-20T16:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.572466 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.671589 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.672093 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.672104 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.672121 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.672131 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:29Z","lastTransitionTime":"2026-03-20T16:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.776821 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.776864 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.776873 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.776890 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.776920 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:29Z","lastTransitionTime":"2026-03-20T16:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.881989 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.882072 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.882097 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.882127 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.882148 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:29Z","lastTransitionTime":"2026-03-20T16:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.984935 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.984976 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.984986 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.985003 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:29 crc kubenswrapper[4708]: I0320 16:02:29.985014 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:29Z","lastTransitionTime":"2026-03-20T16:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.087977 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.088027 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.088039 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.088059 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.088072 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:30Z","lastTransitionTime":"2026-03-20T16:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.110656 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:30 crc kubenswrapper[4708]: E0320 16:02:30.110874 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.190565 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.190614 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.190628 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.190645 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.190655 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:30Z","lastTransitionTime":"2026-03-20T16:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.293199 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.293281 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.293305 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.293334 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.293351 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:30Z","lastTransitionTime":"2026-03-20T16:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.379849 4708 generic.go:334] "Generic (PLEG): container finished" podID="b9ddc889-df44-41f4-bb84-bb103bf9695a" containerID="954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89" exitCode=0 Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.379907 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" event={"ID":"b9ddc889-df44-41f4-bb84-bb103bf9695a","Type":"ContainerDied","Data":"954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89"} Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.392043 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.397507 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.397583 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.397625 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.397652 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.397705 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:30Z","lastTransitionTime":"2026-03-20T16:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.411726 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.423887 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.439965 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.451395 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.473717 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.491600 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.510940 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.510990 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.510999 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.511015 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.511026 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:30Z","lastTransitionTime":"2026-03-20T16:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.518246 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.540915 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.554393 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.566370 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.584821 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.599886 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.613691 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.613727 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.613740 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.613760 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.613773 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:30Z","lastTransitionTime":"2026-03-20T16:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.613987 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.715932 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.715982 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.715995 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.716016 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.716029 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:30Z","lastTransitionTime":"2026-03-20T16:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.818579 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.818620 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.818635 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.818652 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.818661 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:30Z","lastTransitionTime":"2026-03-20T16:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.921132 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.921181 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.921194 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.921211 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:30 crc kubenswrapper[4708]: I0320 16:02:30.921222 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:30Z","lastTransitionTime":"2026-03-20T16:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.023917 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.023965 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.023979 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.023997 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.024011 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:31Z","lastTransitionTime":"2026-03-20T16:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.110866 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.110943 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:31 crc kubenswrapper[4708]: E0320 16:02:31.111035 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:31 crc kubenswrapper[4708]: E0320 16:02:31.111119 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.126651 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.126725 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.126738 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.126761 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.126773 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:31Z","lastTransitionTime":"2026-03-20T16:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.231790 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.231845 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.231856 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.231872 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.231882 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:31Z","lastTransitionTime":"2026-03-20T16:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.335626 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.336117 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.336132 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.336152 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.336167 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:31Z","lastTransitionTime":"2026-03-20T16:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.387694 4708 generic.go:334] "Generic (PLEG): container finished" podID="b9ddc889-df44-41f4-bb84-bb103bf9695a" containerID="3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284" exitCode=0 Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.387767 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" event={"ID":"b9ddc889-df44-41f4-bb84-bb103bf9695a","Type":"ContainerDied","Data":"3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.395351 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.400319 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.415811 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0a
d5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.428149 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.442015 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.442882 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.442972 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.442994 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.443024 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.443049 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:31Z","lastTransitionTime":"2026-03-20T16:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.461707 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.474575 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.494817 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.507179 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.522380 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.533148 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.542727 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.545926 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.546001 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.546019 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.546042 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.546055 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:31Z","lastTransitionTime":"2026-03-20T16:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.551807 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.562081 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.583470 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.649702 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.649733 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.649741 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.649756 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.649766 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:31Z","lastTransitionTime":"2026-03-20T16:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.752495 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.752522 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.752532 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.752545 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.752554 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:31Z","lastTransitionTime":"2026-03-20T16:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.856491 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.856548 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.856566 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.856583 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.856596 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:31Z","lastTransitionTime":"2026-03-20T16:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.959656 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.959762 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.959783 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.960199 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.960257 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:31Z","lastTransitionTime":"2026-03-20T16:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.987068 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-trmk9"] Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.987641 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-trmk9" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.991009 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.991139 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.993118 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 16:02:31 crc kubenswrapper[4708]: I0320 16:02:31.993218 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.004169 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.024888 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.038114 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.039571 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5p8q\" (UniqueName: \"kubernetes.io/projected/2e14ba83-1670-47d0-85b5-a1e8fbdafe62-kube-api-access-v5p8q\") pod \"node-ca-trmk9\" (UID: \"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\") " pod="openshift-image-registry/node-ca-trmk9" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.039628 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2e14ba83-1670-47d0-85b5-a1e8fbdafe62-serviceca\") pod \"node-ca-trmk9\" (UID: \"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\") " pod="openshift-image-registry/node-ca-trmk9" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.039815 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e14ba83-1670-47d0-85b5-a1e8fbdafe62-host\") pod \"node-ca-trmk9\" (UID: \"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\") " pod="openshift-image-registry/node-ca-trmk9" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.052716 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.065832 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.065892 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.065911 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.065936 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.065954 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:32Z","lastTransitionTime":"2026-03-20T16:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.066822 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.082987 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.092768 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.107953 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.110042 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:32 crc kubenswrapper[4708]: E0320 16:02:32.110227 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.124594 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.133026 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.141216 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2e14ba83-1670-47d0-85b5-a1e8fbdafe62-serviceca\") pod \"node-ca-trmk9\" (UID: \"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\") " pod="openshift-image-registry/node-ca-trmk9" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.141274 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e14ba83-1670-47d0-85b5-a1e8fbdafe62-host\") pod \"node-ca-trmk9\" (UID: \"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\") " pod="openshift-image-registry/node-ca-trmk9" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.141304 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5p8q\" (UniqueName: \"kubernetes.io/projected/2e14ba83-1670-47d0-85b5-a1e8fbdafe62-kube-api-access-v5p8q\") pod \"node-ca-trmk9\" (UID: \"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\") " pod="openshift-image-registry/node-ca-trmk9" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.141801 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.141833 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e14ba83-1670-47d0-85b5-a1e8fbdafe62-host\") pod \"node-ca-trmk9\" (UID: \"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\") " pod="openshift-image-registry/node-ca-trmk9" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.142640 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2e14ba83-1670-47d0-85b5-a1e8fbdafe62-serviceca\") pod \"node-ca-trmk9\" (UID: \"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\") " pod="openshift-image-registry/node-ca-trmk9" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.148313 4708 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.160526 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5p8q\" (UniqueName: \"kubernetes.io/projected/2e14ba83-1670-47d0-85b5-a1e8fbdafe62-kube-api-access-v5p8q\") pod \"node-ca-trmk9\" (UID: \"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\") " pod="openshift-image-registry/node-ca-trmk9" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.162275 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.173963 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.174004 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.174015 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.174032 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.174043 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:32Z","lastTransitionTime":"2026-03-20T16:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.175753 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.186644 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.277440 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.277477 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.277491 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.277513 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.277527 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:32Z","lastTransitionTime":"2026-03-20T16:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.306780 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-trmk9" Mar 20 16:02:32 crc kubenswrapper[4708]: W0320 16:02:32.327836 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e14ba83_1670_47d0_85b5_a1e8fbdafe62.slice/crio-df8a39b83eeb36611a4fd610960d74d0009f813017ea7a32e7fa9afff3f52f4b WatchSource:0}: Error finding container df8a39b83eeb36611a4fd610960d74d0009f813017ea7a32e7fa9afff3f52f4b: Status 404 returned error can't find the container with id df8a39b83eeb36611a4fd610960d74d0009f813017ea7a32e7fa9afff3f52f4b Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.380337 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.380376 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.380388 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.380405 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.380418 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:32Z","lastTransitionTime":"2026-03-20T16:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.399733 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-trmk9" event={"ID":"2e14ba83-1670-47d0-85b5-a1e8fbdafe62","Type":"ContainerStarted","Data":"df8a39b83eeb36611a4fd610960d74d0009f813017ea7a32e7fa9afff3f52f4b"} Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.404903 4708 generic.go:334] "Generic (PLEG): container finished" podID="b9ddc889-df44-41f4-bb84-bb103bf9695a" containerID="5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397" exitCode=0 Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.404989 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" event={"ID":"b9ddc889-df44-41f4-bb84-bb103bf9695a","Type":"ContainerDied","Data":"5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397"} Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.426095 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.440114 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.459533 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.469898 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d
4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.482392 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.482450 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.482472 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.482499 4708 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.482517 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:32Z","lastTransitionTime":"2026-03-20T16:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.487237 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.496293 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.506860 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.522254 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.532823 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.543658 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.550776 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.562065 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0a
d5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.571928 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.581353 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.584704 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.584734 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.584744 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.584758 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.584767 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:32Z","lastTransitionTime":"2026-03-20T16:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.596026 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.687768 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.687809 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.687822 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.687841 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.687855 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:32Z","lastTransitionTime":"2026-03-20T16:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.789944 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.789985 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.789997 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.790013 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.790023 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:32Z","lastTransitionTime":"2026-03-20T16:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.892613 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.892664 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.892694 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.892712 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:32 crc kubenswrapper[4708]: I0320 16:02:32.892724 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:32Z","lastTransitionTime":"2026-03-20T16:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:32.994611 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:32.994639 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:32.994647 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:32.994660 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:32.994688 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:32Z","lastTransitionTime":"2026-03-20T16:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.096371 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.096419 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.096431 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.096447 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.096458 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:33Z","lastTransitionTime":"2026-03-20T16:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.110209 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.110239 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:33 crc kubenswrapper[4708]: E0320 16:02:33.110306 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:33 crc kubenswrapper[4708]: E0320 16:02:33.110368 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.200123 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.200182 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.200196 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.200225 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.200243 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:33Z","lastTransitionTime":"2026-03-20T16:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.304635 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.304707 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.304721 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.304740 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.304753 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:33Z","lastTransitionTime":"2026-03-20T16:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.415076 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.415122 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.415143 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.415164 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.415181 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:33Z","lastTransitionTime":"2026-03-20T16:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.422785 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.423016 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.423086 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.423099 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.427141 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-trmk9" event={"ID":"2e14ba83-1670-47d0-85b5-a1e8fbdafe62","Type":"ContainerStarted","Data":"aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.431938 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" event={"ID":"b9ddc889-df44-41f4-bb84-bb103bf9695a","Type":"ContainerStarted","Data":"642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.436922 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.455118 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.455171 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.455736 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.470900 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.484006 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.495467 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.504247 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.517541 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.517605 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.517623 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.517649 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.517694 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:33Z","lastTransitionTime":"2026-03-20T16:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.519538 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.529524 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.540403 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.549936 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.561806 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.574885 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.590965 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.606975 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 
16:02:33.620592 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.620813 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.620937 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.621034 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.621146 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:33Z","lastTransitionTime":"2026-03-20T16:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.628729 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.643697 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.666188 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.678432 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.691287 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20
T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.702968 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.712118 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.719126 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.724299 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.724342 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.724359 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.724381 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.724398 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:33Z","lastTransitionTime":"2026-03-20T16:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.727234 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.744630 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.754266 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.772054 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0a
d5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.785796 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.797784 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.809095 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.815370 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.827637 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.827738 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.827761 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.827791 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.827813 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:33Z","lastTransitionTime":"2026-03-20T16:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.932154 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.932234 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.932254 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.932285 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:33 crc kubenswrapper[4708]: I0320 16:02:33.932306 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:33Z","lastTransitionTime":"2026-03-20T16:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.036349 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.036423 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.036443 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.036473 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.036495 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.110317 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:34 crc kubenswrapper[4708]: E0320 16:02:34.110552 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.140388 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.140691 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.140760 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.140823 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.140879 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.245971 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.246541 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.246561 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.246586 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.246607 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.348877 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.348943 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.348957 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.348975 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.348987 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.379897 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.379928 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.379993 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.380013 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.380023 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: E0320 16:02:34.388614 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.391607 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.391630 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.391638 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.391654 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.391663 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: E0320 16:02:34.399599 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.402781 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.402819 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.402830 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.402993 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.403013 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: E0320 16:02:34.411291 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.415222 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.415263 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.415276 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.415295 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.415306 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: E0320 16:02:34.424395 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.427804 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.427910 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.427986 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.428084 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.428179 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.437412 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.437456 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577"} Mar 20 16:02:34 crc kubenswrapper[4708]: E0320 16:02:34.438308 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: E0320 16:02:34.438614 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.448929 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.451799 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.451895 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.451961 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.452022 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.452091 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.458184 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.466050 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.476132 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0a
d5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.484395 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.495178 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.506061 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba
5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.520341 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87
c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.528632 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.535087 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.544141 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 
16:02:34.554351 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.554404 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.554416 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.554435 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.554448 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.560071 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.567189 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.577602 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.587969 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.657298 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.657348 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.657358 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.657374 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.657385 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.760305 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.760441 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.760462 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.760477 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.760486 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.863087 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.863151 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.863166 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.863185 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.863198 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.966400 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.966476 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.966495 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.966519 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:34 crc kubenswrapper[4708]: I0320 16:02:34.966535 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:34Z","lastTransitionTime":"2026-03-20T16:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.069040 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.069086 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.069100 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.069120 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.069133 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:35Z","lastTransitionTime":"2026-03-20T16:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.110497 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.110608 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:35 crc kubenswrapper[4708]: E0320 16:02:35.110625 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:35 crc kubenswrapper[4708]: E0320 16:02:35.110825 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.172060 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.172108 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.172118 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.172134 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.172144 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:35Z","lastTransitionTime":"2026-03-20T16:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.275805 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.275856 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.275870 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.275888 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.275899 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:35Z","lastTransitionTime":"2026-03-20T16:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.378279 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.378323 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.378333 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.378349 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.378362 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:35Z","lastTransitionTime":"2026-03-20T16:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.445653 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604"} Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.462983 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.480310 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.480356 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.480369 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.480386 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.480398 4708 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:35Z","lastTransitionTime":"2026-03-20T16:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.490658 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e
85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.511483 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.583337 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67ceb
b9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:
02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.583761 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.583796 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.583818 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.583836 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.583847 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:35Z","lastTransitionTime":"2026-03-20T16:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.596855 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.615402 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.627885 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.648272 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.671447 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.686399 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.686455 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.686586 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.686601 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.686619 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 
16:02:35.686630 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:35Z","lastTransitionTime":"2026-03-20T16:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.698163 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.711269 4708 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\
\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db3
95950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.721928 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.739978 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.759978 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:35Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.789359 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.789406 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.789415 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.789432 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.789443 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:35Z","lastTransitionTime":"2026-03-20T16:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.892207 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.892255 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.892264 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.892280 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.892291 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:35Z","lastTransitionTime":"2026-03-20T16:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.994522 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.994565 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.994575 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.994592 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:35 crc kubenswrapper[4708]: I0320 16:02:35.994603 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:35Z","lastTransitionTime":"2026-03-20T16:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.097494 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.097540 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.097549 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.097564 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.097576 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:36Z","lastTransitionTime":"2026-03-20T16:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.110195 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:36 crc kubenswrapper[4708]: E0320 16:02:36.110782 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.132720 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.147072 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.168467 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.183533 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.199199 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.200190 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.200236 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.200249 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.200266 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.200279 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:36Z","lastTransitionTime":"2026-03-20T16:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.213180 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.233154 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.249282 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0a
d5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.265497 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.279456 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.293979 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.304005 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.304049 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.304059 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.304077 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.304090 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:36Z","lastTransitionTime":"2026-03-20T16:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.311958 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.336264 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.354159 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.372172 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f7
38938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.408157 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.408205 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.408218 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.408240 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.408253 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:36Z","lastTransitionTime":"2026-03-20T16:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.450521 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.452820 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/0.log" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.458929 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a" exitCode=1 Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.458967 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.460336 4708 scope.go:117] "RemoveContainer" containerID="368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.472918 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.488658 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.502895 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.511899 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.511951 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.511962 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.511978 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.511991 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:36Z","lastTransitionTime":"2026-03-20T16:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.520201 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.551279 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.569906 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.586865 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.605160 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0a
d5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.615158 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.615249 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.615264 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.615290 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.615304 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:36Z","lastTransitionTime":"2026-03-20T16:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.623309 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.642094 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.661845 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.679161 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.700048 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cf
df39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.717561 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.719652 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:36 crc 
kubenswrapper[4708]: I0320 16:02:36.719699 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.719711 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.719727 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.719740 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:36Z","lastTransitionTime":"2026-03-20T16:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.733749 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba
5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.747602 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.779325 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a56
46fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.793108 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.810715 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67ceb
b9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:
02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.822150 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.822190 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.822200 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.822216 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.822226 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:36Z","lastTransitionTime":"2026-03-20T16:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.828973 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.847966 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.862141 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.877261 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.903054 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:02:35.753546 6521 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:02:35.754803 6521 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 16:02:35.754829 6521 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 16:02:35.754851 6521 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 16:02:35.754860 6521 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 16:02:35.754865 6521 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:02:35.754871 6521 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 16:02:35.754879 6521 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 16:02:35.754890 6521 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 16:02:35.754895 6521 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 16:02:35.754901 6521 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 16:02:35.754919 6521 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 16:02:35.754935 6521 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 16:02:35.754926 6521 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.916550 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.924944 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.925013 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.925028 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.925048 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.925063 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:36Z","lastTransitionTime":"2026-03-20T16:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.930529 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.954699 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:36 crc kubenswrapper[4708]: I0320 16:02:36.983840 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.004537 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.018811 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.027293 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.027338 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.027348 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.027364 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.027373 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.110429 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.110467 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:37 crc kubenswrapper[4708]: E0320 16:02:37.110596 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:37 crc kubenswrapper[4708]: E0320 16:02:37.110708 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.130417 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.130490 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.130502 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.130526 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.130544 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.233379 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.233435 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.233448 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.233468 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.233481 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.336375 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.336413 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.336422 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.336440 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.336480 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.439466 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.439528 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.439541 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.439559 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.439570 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.465224 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/0.log" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.467930 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28"} Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.468431 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.505206 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec
1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.524554 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.543340 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc 
kubenswrapper[4708]: I0320 16:02:37.543414 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.543433 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.543460 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.543494 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.547518 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba
5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.562023 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.576160 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.586496 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.597910 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.617181 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:02:35.753546 6521 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:02:35.754803 6521 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 
16:02:35.754829 6521 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 16:02:35.754851 6521 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 16:02:35.754860 6521 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 16:02:35.754865 6521 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:02:35.754871 6521 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 16:02:35.754879 6521 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 16:02:35.754890 6521 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 16:02:35.754895 6521 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 16:02:35.754901 6521 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 16:02:35.754919 6521 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 16:02:35.754935 6521 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 16:02:35.754926 6521 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.626204 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.637786 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc 
kubenswrapper[4708]: I0320 16:02:37.646092 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.646152 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.646165 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.646187 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.646202 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.652612 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.664714 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.678179 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.692064 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.709288 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.749254 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.749292 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.749301 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.749318 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.749328 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.851094 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.851148 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.851157 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.851171 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.851179 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.894064 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6"] Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.894559 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.897333 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.897660 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.905525 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a798e17b-98de-4215-abe5-82adf76e66ab-env-overrides\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.905566 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lwld\" (UniqueName: \"kubernetes.io/projected/a798e17b-98de-4215-abe5-82adf76e66ab-kube-api-access-2lwld\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.905589 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a798e17b-98de-4215-abe5-82adf76e66ab-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.905617 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a798e17b-98de-4215-abe5-82adf76e66ab-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.909365 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.927381 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.949871 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.953732 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.953766 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.953774 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.953789 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.953798 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:37Z","lastTransitionTime":"2026-03-20T16:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.966060 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:37 crc kubenswrapper[4708]: I0320 16:02:37.985869 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.000566 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.006456 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.006548 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lwld\" (UniqueName: \"kubernetes.io/projected/a798e17b-98de-4215-abe5-82adf76e66ab-kube-api-access-2lwld\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.006572 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a798e17b-98de-4215-abe5-82adf76e66ab-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.006594 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" 
(UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.006615 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a798e17b-98de-4215-abe5-82adf76e66ab-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.006637 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.006712 4708 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.006722 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:03:10.006690069 +0000 UTC m=+144.681026794 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.006761 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:10.006750291 +0000 UTC m=+144.681087096 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.006792 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.006826 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.006848 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a798e17b-98de-4215-abe5-82adf76e66ab-env-overrides\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.006898 4708 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.006896 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.006936 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:10.006921245 +0000 UTC m=+144.681258020 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.006945 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.006962 4708 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.006915 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.007021 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:10.006996907 +0000 UTC m=+144.681333622 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.007023 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.007046 4708 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.007082 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:10.007072429 +0000 UTC m=+144.681409224 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.007430 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a798e17b-98de-4215-abe5-82adf76e66ab-env-overrides\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.007864 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a798e17b-98de-4215-abe5-82adf76e66ab-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.013622 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a798e17b-98de-4215-abe5-82adf76e66ab-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.015506 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.023439 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lwld\" (UniqueName: \"kubernetes.io/projected/a798e17b-98de-4215-abe5-82adf76e66ab-kube-api-access-2lwld\") pod \"ovnkube-control-plane-749d76644c-68gt6\" (UID: \"a798e17b-98de-4215-abe5-82adf76e66ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.043371 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54
b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\
\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.055114 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.056528 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:38 crc 
kubenswrapper[4708]: I0320 16:02:38.056571 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.056582 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.056601 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.056612 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:38Z","lastTransitionTime":"2026-03-20T16:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.066955 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba
5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.084237 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:02:35.753546 6521 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:02:35.754803 6521 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 
16:02:35.754829 6521 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 16:02:35.754851 6521 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 16:02:35.754860 6521 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 16:02:35.754865 6521 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:02:35.754871 6521 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 16:02:35.754879 6521 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 16:02:35.754890 6521 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 16:02:35.754895 6521 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 16:02:35.754901 6521 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 16:02:35.754919 6521 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 16:02:35.754935 6521 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 16:02:35.754926 6521 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.093379 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.104411 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.112480 4708 scope.go:117] "RemoveContainer" containerID="8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.112766 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.112820 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.116074 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.126414 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.135236 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.158700 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.158734 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.158744 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.158762 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.158772 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:38Z","lastTransitionTime":"2026-03-20T16:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.207302 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" Mar 20 16:02:38 crc kubenswrapper[4708]: W0320 16:02:38.222185 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda798e17b_98de_4215_abe5_82adf76e66ab.slice/crio-7a4c71710367ad53da36cab5d3ba9f32bd79418645a26b0a22b60bf437c05d9d WatchSource:0}: Error finding container 7a4c71710367ad53da36cab5d3ba9f32bd79418645a26b0a22b60bf437c05d9d: Status 404 returned error can't find the container with id 7a4c71710367ad53da36cab5d3ba9f32bd79418645a26b0a22b60bf437c05d9d Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.261606 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.261646 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.261657 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.261691 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.261705 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:38Z","lastTransitionTime":"2026-03-20T16:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.364976 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.365005 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.365015 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.365031 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.365040 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:38Z","lastTransitionTime":"2026-03-20T16:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.469203 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.469279 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.469295 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.469312 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.469322 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:38Z","lastTransitionTime":"2026-03-20T16:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.473974 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" event={"ID":"a798e17b-98de-4215-abe5-82adf76e66ab","Type":"ContainerStarted","Data":"fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.474017 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" event={"ID":"a798e17b-98de-4215-abe5-82adf76e66ab","Type":"ContainerStarted","Data":"7a4c71710367ad53da36cab5d3ba9f32bd79418645a26b0a22b60bf437c05d9d"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.476943 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.478307 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.478596 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.479892 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/1.log" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.480439 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/0.log" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.483052 4708 
generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28" exitCode=1 Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.483092 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.483126 4708 scope.go:117] "RemoveContainer" containerID="368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.483633 4708 scope.go:117] "RemoveContainer" containerID="b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28" Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.483794 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.496609 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.512537 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.536922 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf
70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.548752 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.562377 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67ceb
b9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:
02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.571857 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.571915 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.571925 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.571942 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.571951 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:38Z","lastTransitionTime":"2026-03-20T16:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.582925 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:02:35.753546 6521 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:02:35.754803 6521 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 
16:02:35.754829 6521 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 16:02:35.754851 6521 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 16:02:35.754860 6521 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 16:02:35.754865 6521 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:02:35.754871 6521 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 16:02:35.754879 6521 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 16:02:35.754890 6521 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 16:02:35.754895 6521 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 16:02:35.754901 6521 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 16:02:35.754919 6521 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 16:02:35.754935 6521 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 16:02:35.754926 6521 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.593685 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.609911 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.622952 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.626972 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-gtlzm"] Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.627717 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.627788 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.637137 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.647592 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d
4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.659235 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.674313 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.674357 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.674369 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.674389 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.674401 4708 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:38Z","lastTransitionTime":"2026-03-20T16:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.677589 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.692571 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.710978 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.713407 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnkl7\" (UniqueName: \"kubernetes.io/projected/3574461f-8c2b-446b-a2f1-c1be3a8d7824-kube-api-access-tnkl7\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.713476 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.728348 4708 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.744805 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.761013 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.776990 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.777042 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.777054 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.777073 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.777087 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:38Z","lastTransitionTime":"2026-03-20T16:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.785237 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.796421 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.808635 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67ceb
b9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:
02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.814901 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.815139 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnkl7\" (UniqueName: \"kubernetes.io/projected/3574461f-8c2b-446b-a2f1-c1be3a8d7824-kube-api-access-tnkl7\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.815075 4708 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:38 crc kubenswrapper[4708]: E0320 16:02:38.815365 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs podName:3574461f-8c2b-446b-a2f1-c1be3a8d7824 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:39.315350114 +0000 UTC m=+113.989686829 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs") pod "network-metrics-daemon-gtlzm" (UID: "3574461f-8c2b-446b-a2f1-c1be3a8d7824") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.819546 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc 
kubenswrapper[4708]: I0320 16:02:38.830631 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnkl7\" (UniqueName: \"kubernetes.io/projected/3574461f-8c2b-446b-a2f1-c1be3a8d7824-kube-api-access-tnkl7\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.836006 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.851471 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.865258 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.877440 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.879705 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.879875 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.880016 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.880115 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.880204 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:38Z","lastTransitionTime":"2026-03-20T16:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.906216 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368b910b476e51bf4cd8c0006ac47729cad09b80f6700c3bfb56c7f843cd7e3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:02:35.753546 6521 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 16:02:35.754803 6521 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 
16:02:35.754829 6521 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 16:02:35.754851 6521 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 16:02:35.754860 6521 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 16:02:35.754865 6521 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:02:35.754871 6521 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 16:02:35.754879 6521 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 16:02:35.754890 6521 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 16:02:35.754895 6521 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 16:02:35.754901 6521 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 16:02:35.754919 6521 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 16:02:35.754935 6521 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 16:02:35.754926 6521 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", 
Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 16:02:37.560047 6691 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:37.560059 66\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\
",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc 
kubenswrapper[4708]: I0320 16:02:38.917952 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v
5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.931553 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d664
38c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.945977 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.961593 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.974516 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.983206 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.983243 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.983255 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.983275 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.983289 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:38Z","lastTransitionTime":"2026-03-20T16:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:38 crc kubenswrapper[4708]: I0320 16:02:38.991605 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:38Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.086680 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.086997 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.087073 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.087140 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.087239 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.110397 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.110483 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:39 crc kubenswrapper[4708]: E0320 16:02:39.110569 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:39 crc kubenswrapper[4708]: E0320 16:02:39.111088 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.189962 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.190062 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.190085 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.190117 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.190138 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.292748 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.292797 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.292808 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.292824 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.292838 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.325870 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:39 crc kubenswrapper[4708]: E0320 16:02:39.326039 4708 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:39 crc kubenswrapper[4708]: E0320 16:02:39.326110 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs podName:3574461f-8c2b-446b-a2f1-c1be3a8d7824 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:40.326094061 +0000 UTC m=+115.000430776 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs") pod "network-metrics-daemon-gtlzm" (UID: "3574461f-8c2b-446b-a2f1-c1be3a8d7824") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.395440 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.395483 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.395493 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.395508 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.395519 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.488346 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/1.log" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.492419 4708 scope.go:117] "RemoveContainer" containerID="b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28" Mar 20 16:02:39 crc kubenswrapper[4708]: E0320 16:02:39.492601 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.493210 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" event={"ID":"a798e17b-98de-4215-abe5-82adf76e66ab","Type":"ContainerStarted","Data":"c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9"} Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.500536 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.500568 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.500577 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.500590 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.500600 4708 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.510883 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name
\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.522373 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.543291 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.558810 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.573309 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67ceb
b9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:
02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.585305 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc 
kubenswrapper[4708]: I0320 16:02:39.602785 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.602845 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.602859 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.602881 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.602894 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.606730 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 16:02:37.560047 6691 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:37.560059 66\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.617942 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.629475 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc 
kubenswrapper[4708]: I0320 16:02:39.644310 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.660746 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.673245 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.688786 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.702017 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.706769 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.706818 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.706831 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.706851 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.706864 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.719099 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.733600 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.746313 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.766499 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.780660 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.795126 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f7
38938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.808494 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc 
kubenswrapper[4708]: I0320 16:02:39.810224 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.810279 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.810293 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.810313 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.810328 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.825814 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.839521 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.852510 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.865574 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.883838 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 16:02:37.560047 6691 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:37.560059 66\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.896461 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.908042 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc 
kubenswrapper[4708]: I0320 16:02:39.912393 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.912443 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.912456 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.912475 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.912488 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:39Z","lastTransitionTime":"2026-03-20T16:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.923161 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.937153 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.950566 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.965222 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.980556 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:39 crc kubenswrapper[4708]: I0320 16:02:39.994441 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:39Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.015129 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.015175 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.015209 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.015229 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.015243 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.110849 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.110888 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:40 crc kubenswrapper[4708]: E0320 16:02:40.111010 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:02:40 crc kubenswrapper[4708]: E0320 16:02:40.111119 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.119114 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.119160 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.119173 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.119188 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.119205 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.221638 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.221687 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.221707 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.221722 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.221730 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.324204 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.324292 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.324305 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.324321 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.324332 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.338146 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:40 crc kubenswrapper[4708]: E0320 16:02:40.338389 4708 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:40 crc kubenswrapper[4708]: E0320 16:02:40.338509 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs podName:3574461f-8c2b-446b-a2f1-c1be3a8d7824 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:42.338482142 +0000 UTC m=+117.012818877 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs") pod "network-metrics-daemon-gtlzm" (UID: "3574461f-8c2b-446b-a2f1-c1be3a8d7824") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.426826 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.426892 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.426909 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.426933 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.426950 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.530083 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.530136 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.530154 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.530173 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.530185 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.632956 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.632990 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.632998 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.633014 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.633024 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.735547 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.735634 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.735651 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.735701 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.735717 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.838129 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.838197 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.838207 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.838240 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.838253 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.940606 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.940646 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.940661 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.940707 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:40 crc kubenswrapper[4708]: I0320 16:02:40.940720 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:40Z","lastTransitionTime":"2026-03-20T16:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.043363 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.043401 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.043409 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.043423 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.043433 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.110197 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.110197 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:41 crc kubenswrapper[4708]: E0320 16:02:41.110477 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:41 crc kubenswrapper[4708]: E0320 16:02:41.110574 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.147057 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.147102 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.147111 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.147128 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.147142 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.249722 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.249776 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.249792 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.249813 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.249830 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.352217 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.352268 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.352280 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.352298 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.352310 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.455187 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.455242 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.455254 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.455272 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.455285 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.559140 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.559222 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.559258 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.559284 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.559303 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.662175 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.662276 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.662304 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.662330 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.662349 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.764971 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.765044 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.765063 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.765091 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.765109 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.867579 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.867652 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.867704 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.867735 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.867757 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.970608 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.970699 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.970725 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.970759 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:41 crc kubenswrapper[4708]: I0320 16:02:41.970788 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:41Z","lastTransitionTime":"2026-03-20T16:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.073260 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.073308 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.073322 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.073341 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.073354 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.111101 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.111220 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:42 crc kubenswrapper[4708]: E0320 16:02:42.111373 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:02:42 crc kubenswrapper[4708]: E0320 16:02:42.111460 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.175891 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.175966 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.175989 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.176019 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.176043 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.278495 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.278536 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.278547 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.278565 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.278579 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.360184 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:42 crc kubenswrapper[4708]: E0320 16:02:42.360419 4708 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:42 crc kubenswrapper[4708]: E0320 16:02:42.360546 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs podName:3574461f-8c2b-446b-a2f1-c1be3a8d7824 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:46.360511831 +0000 UTC m=+121.034848576 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs") pod "network-metrics-daemon-gtlzm" (UID: "3574461f-8c2b-446b-a2f1-c1be3a8d7824") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.381284 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.381328 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.381339 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.381355 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.381364 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.484373 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.484411 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.484420 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.484446 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.484455 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.587406 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.587456 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.587469 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.587487 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.587499 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.689839 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.689896 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.689917 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.689946 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.689968 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.792462 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.792509 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.792520 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.792538 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.792549 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.895197 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.895255 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.895272 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.895297 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.895316 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.998067 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.998397 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.998461 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.998530 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:42 crc kubenswrapper[4708]: I0320 16:02:42.998597 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:42Z","lastTransitionTime":"2026-03-20T16:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.102007 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.102054 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.102066 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.102086 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.102100 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.110564 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.110724 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:43 crc kubenswrapper[4708]: E0320 16:02:43.110906 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:43 crc kubenswrapper[4708]: E0320 16:02:43.111195 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.204940 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.204998 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.205014 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.205033 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.205046 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.307500 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.307894 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.308034 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.308171 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.308297 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.411269 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.411548 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.411751 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.411822 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.411877 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.513966 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.514033 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.514048 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.514064 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.514077 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.616868 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.616908 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.616917 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.616949 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.616958 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.720485 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.720562 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.720581 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.720607 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.720628 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.823227 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.823310 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.823333 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.823363 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.823384 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.926376 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.926422 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.926432 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.926449 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:43 crc kubenswrapper[4708]: I0320 16:02:43.926459 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:43Z","lastTransitionTime":"2026-03-20T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.028597 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.028638 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.028649 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.028680 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.028690 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.110947 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.111077 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:44 crc kubenswrapper[4708]: E0320 16:02:44.111197 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:44 crc kubenswrapper[4708]: E0320 16:02:44.111272 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.131107 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.131146 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.131158 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.131174 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.131185 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.233977 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.234027 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.234038 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.234055 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.234067 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.336542 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.336587 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.336599 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.336621 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.336634 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.439212 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.439248 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.439258 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.439273 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.439283 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.465307 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.465358 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.465369 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.465384 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.465394 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: E0320 16:02:44.478328 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:44Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.482557 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.482610 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.482627 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.482648 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.482660 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: E0320 16:02:44.495401 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:44Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.499244 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.499281 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.499292 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.499305 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.499315 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: E0320 16:02:44.511734 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:44Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.515784 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.515817 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.515826 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.515842 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.515852 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: E0320 16:02:44.529099 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:44Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.532375 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.532410 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.532419 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.532435 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.532446 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: E0320 16:02:44.546085 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:44Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:44 crc kubenswrapper[4708]: E0320 16:02:44.546300 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.547812 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.547868 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.547882 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.547902 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.547913 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.650369 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.650410 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.650424 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.650441 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.650456 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.753385 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.753419 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.753428 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.753441 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.753450 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.855615 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.855661 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.855693 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.855711 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.855723 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.958558 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.958639 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.958705 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.958740 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:44 crc kubenswrapper[4708]: I0320 16:02:44.958760 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:44Z","lastTransitionTime":"2026-03-20T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.061642 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.061752 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.061770 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.061794 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.061812 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.110291 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.110320 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:45 crc kubenswrapper[4708]: E0320 16:02:45.110609 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:45 crc kubenswrapper[4708]: E0320 16:02:45.110726 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.164806 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.164852 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.164864 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.164883 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.164894 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.268492 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.268543 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.268557 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.268576 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.268589 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.372110 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.372198 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.372221 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.372252 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.372275 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.474130 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.474159 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.474169 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.474182 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.474191 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.577492 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.577533 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.577545 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.577564 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.577576 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.680224 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.680270 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.680283 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.682879 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.682896 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.786161 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.786200 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.786210 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.786226 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.786238 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.888633 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.888800 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.888813 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.888826 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.888837 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.991648 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.991706 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.991715 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.991730 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:45 crc kubenswrapper[4708]: I0320 16:02:45.991739 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:45Z","lastTransitionTime":"2026-03-20T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:46 crc kubenswrapper[4708]: E0320 16:02:46.092558 4708 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.109972 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:46 crc kubenswrapper[4708]: E0320 16:02:46.110107 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.110306 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:46 crc kubenswrapper[4708]: E0320 16:02:46.110614 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.149089 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerI
D\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.169381 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.186783 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67ceb
b9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:
02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.201878 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.214616 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.224138 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.235689 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: E0320 16:02:46.243056 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.256298 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 16:02:37.560047 6691 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:37.560059 66\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.265973 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.275209 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc 
kubenswrapper[4708]: I0320 16:02:46.289652 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.306071 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.320627 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.332510 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.343828 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.360023 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e7
38fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.377622 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:46 crc kubenswrapper[4708]: I0320 16:02:46.406248 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:46 crc kubenswrapper[4708]: E0320 16:02:46.406426 4708 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:46 crc kubenswrapper[4708]: E0320 16:02:46.406552 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs podName:3574461f-8c2b-446b-a2f1-c1be3a8d7824 nodeName:}" failed. No retries permitted until 2026-03-20 16:02:54.406527002 +0000 UTC m=+129.080863727 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs") pod "network-metrics-daemon-gtlzm" (UID: "3574461f-8c2b-446b-a2f1-c1be3a8d7824") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:47 crc kubenswrapper[4708]: I0320 16:02:47.111051 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:47 crc kubenswrapper[4708]: I0320 16:02:47.111051 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:47 crc kubenswrapper[4708]: E0320 16:02:47.111369 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:47 crc kubenswrapper[4708]: E0320 16:02:47.111240 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:48 crc kubenswrapper[4708]: I0320 16:02:48.110141 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:48 crc kubenswrapper[4708]: I0320 16:02:48.110224 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:48 crc kubenswrapper[4708]: E0320 16:02:48.110353 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:02:48 crc kubenswrapper[4708]: E0320 16:02:48.110511 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:49 crc kubenswrapper[4708]: I0320 16:02:49.110166 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:49 crc kubenswrapper[4708]: E0320 16:02:49.110347 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:49 crc kubenswrapper[4708]: I0320 16:02:49.110585 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:49 crc kubenswrapper[4708]: E0320 16:02:49.110648 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:50 crc kubenswrapper[4708]: I0320 16:02:50.111002 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:50 crc kubenswrapper[4708]: I0320 16:02:50.111204 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:50 crc kubenswrapper[4708]: E0320 16:02:50.111324 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:02:50 crc kubenswrapper[4708]: E0320 16:02:50.111440 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:51 crc kubenswrapper[4708]: I0320 16:02:51.110419 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:51 crc kubenswrapper[4708]: I0320 16:02:51.110433 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:51 crc kubenswrapper[4708]: E0320 16:02:51.110665 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:51 crc kubenswrapper[4708]: E0320 16:02:51.110780 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:51 crc kubenswrapper[4708]: E0320 16:02:51.245315 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:02:52 crc kubenswrapper[4708]: I0320 16:02:52.110165 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:52 crc kubenswrapper[4708]: I0320 16:02:52.110201 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:52 crc kubenswrapper[4708]: E0320 16:02:52.110453 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:52 crc kubenswrapper[4708]: E0320 16:02:52.110666 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.110101 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.110160 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:53 crc kubenswrapper[4708]: E0320 16:02:53.110700 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:53 crc kubenswrapper[4708]: E0320 16:02:53.110847 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.111334 4708 scope.go:117] "RemoveContainer" containerID="b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.551347 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/1.log" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.554911 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2"} Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.555572 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.575933 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.600599 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.620832 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.635417 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.653065 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.670185 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.686259 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.700303 4708 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.715909 4708 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:
02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600
284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.736972 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.752928 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.765131 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.778553 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.797430 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 16:02:37.560047 6691 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:37.560059 
66\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.806496 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc kubenswrapper[4708]: I0320 16:02:53.817040 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:53 crc 
kubenswrapper[4708]: I0320 16:02:53.832412 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:53Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.110851 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.110946 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:54 crc kubenswrapper[4708]: E0320 16:02:54.111033 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:54 crc kubenswrapper[4708]: E0320 16:02:54.111110 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.502723 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:54 crc kubenswrapper[4708]: E0320 16:02:54.502912 4708 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:54 crc kubenswrapper[4708]: E0320 16:02:54.503033 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs podName:3574461f-8c2b-446b-a2f1-c1be3a8d7824 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:10.503004692 +0000 UTC m=+145.177341437 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs") pod "network-metrics-daemon-gtlzm" (UID: "3574461f-8c2b-446b-a2f1-c1be3a8d7824") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.561329 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/2.log" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.562194 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/1.log" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.565756 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2" exitCode=1 Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.565872 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2"} Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.565959 4708 scope.go:117] "RemoveContainer" containerID="b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.566755 4708 scope.go:117] "RemoveContainer" containerID="36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2" Mar 20 16:02:54 crc kubenswrapper[4708]: E0320 16:02:54.566978 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.583430 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"nam
e\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.598294 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d
4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.622071 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b31f8fc4124c5a285059a3f147adec0c1b66b37acba1602911e1d2efe0df6a28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 16:02:37.560047 6691 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:37Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:37.560059 66\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"t-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 16:02:54.133730 6975 services_controller.go:445] Built service openshift-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 16:02:54.133755 6975 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network 
controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:54.133762 6975 services_controller.go:451] Built service openshift-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protoc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203
cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.634449 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.647167 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc 
kubenswrapper[4708]: I0320 16:02:54.660579 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.674481 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.690127 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.704647 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.720068 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.738314 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.752827 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.767517 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.782237 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.805614 4708 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03
-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca
0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.824431 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.824531 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.824548 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.824570 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.824583 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.836955 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: E0320 16:02:54.848577 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.854241 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.854280 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.854291 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.854312 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.854324 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.860649 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:
02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: E0320 16:02:54.871960 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.876534 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.876588 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.876606 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.876630 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.876650 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:54 crc kubenswrapper[4708]: E0320 16:02:54.899000 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.903805 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.903873 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.903898 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.903929 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.903951 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:54 crc kubenswrapper[4708]: E0320 16:02:54.925883 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.930946 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.930999 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.931014 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.931034 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:02:54 crc kubenswrapper[4708]: I0320 16:02:54.931050 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:02:54Z","lastTransitionTime":"2026-03-20T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:02:54 crc kubenswrapper[4708]: E0320 16:02:54.947422 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:54 crc kubenswrapper[4708]: E0320 16:02:54.948387 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.111118 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.111122 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:55 crc kubenswrapper[4708]: E0320 16:02:55.111304 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:55 crc kubenswrapper[4708]: E0320 16:02:55.111780 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.128661 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.571358 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/2.log" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.575526 4708 scope.go:117] "RemoveContainer" containerID="36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2" Mar 20 16:02:55 crc kubenswrapper[4708]: E0320 16:02:55.575878 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.590301 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.603719 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.623331 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.644485 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"t-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 16:02:54.133730 6975 services_controller.go:445] Built service openshift-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 16:02:54.133755 6975 ovnkube.go:137] failed to run ovnkube: [failed to 
start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:54.133762 6975 services_controller.go:451] Built service openshift-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protoc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.655042 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.669434 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc 
kubenswrapper[4708]: I0320 16:02:55.685305 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.708342 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.725338 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.744890 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.762612 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.781040 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.804729 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.827900 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.841808 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.863615 4708 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.889014 4708 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:
02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600
284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:55 crc kubenswrapper[4708]: I0320 16:02:55.926385 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:55Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.110052 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.110069 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:56 crc kubenswrapper[4708]: E0320 16:02:56.110238 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:02:56 crc kubenswrapper[4708]: E0320 16:02:56.110392 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.126408 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.168079 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.184088 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.200318 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.211961 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.225107 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e7
38fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.237585 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: E0320 16:02:56.246459 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.268192 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\
\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7a
f9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.284420 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.299209 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67ceb
b9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:
02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.312770 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.328725 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.341002 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.353499 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.371870 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"t-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 16:02:54.133730 6975 services_controller.go:445] Built service openshift-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 16:02:54.133755 6975 ovnkube.go:137] failed to run ovnkube: [failed to 
start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:54.133762 6975 services_controller.go:451] Built service openshift-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protoc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.384187 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.397806 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc 
kubenswrapper[4708]: I0320 16:02:56.414971 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.892442 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.908627 4708 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1
b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.931638 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646
ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.953935 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.973961 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:56 crc kubenswrapper[4708]: I0320 16:02:56.990881 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:02:56Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.004657 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.016584 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.035932 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe
1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.049187 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.070544 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67ceb
b9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:
02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.085830 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.101503 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.110926 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.110949 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:57 crc kubenswrapper[4708]: E0320 16:02:57.111101 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:57 crc kubenswrapper[4708]: E0320 16:02:57.111210 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.116663 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.131736 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.150638 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.184103 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"t-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 16:02:54.133730 6975 services_controller.go:445] Built service openshift-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 16:02:54.133755 6975 ovnkube.go:137] failed to run ovnkube: [failed to 
start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:54.133762 6975 services_controller.go:451] Built service openshift-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protoc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.205376 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:57 crc kubenswrapper[4708]: I0320 16:02:57.229964 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:57Z is after 2025-08-24T17:21:41Z" Mar 20 16:02:58 crc 
kubenswrapper[4708]: I0320 16:02:58.111146 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:02:58 crc kubenswrapper[4708]: I0320 16:02:58.111173 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:02:58 crc kubenswrapper[4708]: E0320 16:02:58.111476 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:02:58 crc kubenswrapper[4708]: E0320 16:02:58.111559 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:02:59 crc kubenswrapper[4708]: I0320 16:02:59.110573 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:02:59 crc kubenswrapper[4708]: I0320 16:02:59.110593 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:02:59 crc kubenswrapper[4708]: E0320 16:02:59.110869 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:02:59 crc kubenswrapper[4708]: E0320 16:02:59.111025 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:00 crc kubenswrapper[4708]: I0320 16:03:00.111330 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:00 crc kubenswrapper[4708]: I0320 16:03:00.111331 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:00 crc kubenswrapper[4708]: E0320 16:03:00.111570 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:00 crc kubenswrapper[4708]: E0320 16:03:00.111843 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:01 crc kubenswrapper[4708]: I0320 16:03:01.110558 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:01 crc kubenswrapper[4708]: I0320 16:03:01.110725 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:01 crc kubenswrapper[4708]: E0320 16:03:01.111157 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:01 crc kubenswrapper[4708]: E0320 16:03:01.111372 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:01 crc kubenswrapper[4708]: E0320 16:03:01.248549 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:02 crc kubenswrapper[4708]: I0320 16:03:02.110484 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:02 crc kubenswrapper[4708]: I0320 16:03:02.110490 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:02 crc kubenswrapper[4708]: E0320 16:03:02.110803 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:02 crc kubenswrapper[4708]: E0320 16:03:02.111003 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:03 crc kubenswrapper[4708]: I0320 16:03:03.110250 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:03 crc kubenswrapper[4708]: I0320 16:03:03.110490 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:03 crc kubenswrapper[4708]: E0320 16:03:03.111145 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:03 crc kubenswrapper[4708]: E0320 16:03:03.111277 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:04 crc kubenswrapper[4708]: I0320 16:03:04.110113 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:04 crc kubenswrapper[4708]: E0320 16:03:04.110378 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:04 crc kubenswrapper[4708]: I0320 16:03:04.110432 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:04 crc kubenswrapper[4708]: E0320 16:03:04.110987 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.062372 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.062432 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.062452 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.062477 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.062495 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4708]: E0320 16:03:05.082051 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.087559 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.087601 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.087611 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.087631 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.087643 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4708]: E0320 16:03:05.104370 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.109420 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.109475 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.109485 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.109521 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.109534 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.110259 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.110262 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:05 crc kubenswrapper[4708]: E0320 16:03:05.110830 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:05 crc kubenswrapper[4708]: E0320 16:03:05.111470 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.129077 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 16:03:05 crc kubenswrapper[4708]: E0320 16:03:05.129209 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet 
has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.134512 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.134582 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.134600 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.134628 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.134649 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4708]: E0320 16:03:05.153515 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.159278 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.159335 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.159357 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.159387 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:05 crc kubenswrapper[4708]: I0320 16:03:05.159405 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:05Z","lastTransitionTime":"2026-03-20T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:05 crc kubenswrapper[4708]: E0320 16:03:05.180285 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:05Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:05 crc kubenswrapper[4708]: E0320 16:03:05.180485 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.110153 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.110189 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:06 crc kubenswrapper[4708]: E0320 16:03:06.110319 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:06 crc kubenswrapper[4708]: E0320 16:03:06.110656 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.126033 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc 
kubenswrapper[4708]: I0320 16:03:06.146035 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.163188 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.182658 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.196882 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.211704 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.235953 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"t-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 16:02:54.133730 6975 services_controller.go:445] Built service openshift-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 16:02:54.133755 6975 ovnkube.go:137] failed to run ovnkube: [failed to 
start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:54.133762 6975 services_controller.go:451] Built service openshift-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protoc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: E0320 16:03:06.249615 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.250882 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.269721 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.290301 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.303888 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.320387 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d
773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.334489 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.347476 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.359117 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.371027 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a995f8b-228b-4ae2-925c-41f31924d374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce00a794483a44bc8052d04362f63fd12b2abe8577d8f276aadb4609c40709f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67a83c494b01d6dcf1c0ff7f564cf91d62c6e0f19fc7d4301dc4cd3f5da38b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf231961d33f62280cb7698c00ad3326849806ddc1d5bf010cb11f6cf5a27c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.390585 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.407603 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:06 crc kubenswrapper[4708]: I0320 16:03:06.425248 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f7
38938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:06Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:07 crc kubenswrapper[4708]: I0320 16:03:07.111121 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:07 crc kubenswrapper[4708]: I0320 16:03:07.111238 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:07 crc kubenswrapper[4708]: E0320 16:03:07.111318 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:07 crc kubenswrapper[4708]: E0320 16:03:07.111426 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:08 crc kubenswrapper[4708]: I0320 16:03:08.110476 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:08 crc kubenswrapper[4708]: E0320 16:03:08.110658 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:08 crc kubenswrapper[4708]: I0320 16:03:08.110746 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:08 crc kubenswrapper[4708]: E0320 16:03:08.110876 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:08 crc kubenswrapper[4708]: I0320 16:03:08.112041 4708 scope.go:117] "RemoveContainer" containerID="36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2" Mar 20 16:03:08 crc kubenswrapper[4708]: E0320 16:03:08.112359 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" Mar 20 16:03:09 crc kubenswrapper[4708]: I0320 16:03:09.111004 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:09 crc kubenswrapper[4708]: I0320 16:03:09.111004 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:09 crc kubenswrapper[4708]: E0320 16:03:09.111217 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:09 crc kubenswrapper[4708]: E0320 16:03:09.111359 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:10 crc kubenswrapper[4708]: I0320 16:03:10.079041 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:03:10 crc kubenswrapper[4708]: I0320 16:03:10.079146 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079189 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:14.079168209 +0000 UTC m=+208.753504934 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:03:10 crc kubenswrapper[4708]: I0320 16:03:10.079219 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:10 crc kubenswrapper[4708]: I0320 16:03:10.079257 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:10 crc kubenswrapper[4708]: I0320 16:03:10.079285 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079329 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 
16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079370 4708 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079336 4708 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079403 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079424 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079373 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079457 4708 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079410 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:14.079401437 +0000 UTC m=+208.753738152 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079438 4708 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079537 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:14.0795143 +0000 UTC m=+208.753851065 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079567 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:14.079550801 +0000 UTC m=+208.753887556 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.079607 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:14.079592412 +0000 UTC m=+208.753929167 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:03:10 crc kubenswrapper[4708]: I0320 16:03:10.109955 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:10 crc kubenswrapper[4708]: I0320 16:03:10.110028 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.110118 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.110296 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:10 crc kubenswrapper[4708]: I0320 16:03:10.583611 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.583822 4708 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:10 crc kubenswrapper[4708]: E0320 16:03:10.583943 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs podName:3574461f-8c2b-446b-a2f1-c1be3a8d7824 nodeName:}" failed. No retries permitted until 2026-03-20 16:03:42.583921235 +0000 UTC m=+177.258258030 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs") pod "network-metrics-daemon-gtlzm" (UID: "3574461f-8c2b-446b-a2f1-c1be3a8d7824") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:11 crc kubenswrapper[4708]: I0320 16:03:11.110380 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:11 crc kubenswrapper[4708]: I0320 16:03:11.110380 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:11 crc kubenswrapper[4708]: E0320 16:03:11.110549 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:11 crc kubenswrapper[4708]: E0320 16:03:11.110867 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:11 crc kubenswrapper[4708]: E0320 16:03:11.250794 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:12 crc kubenswrapper[4708]: I0320 16:03:12.110357 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:12 crc kubenswrapper[4708]: E0320 16:03:12.110733 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:12 crc kubenswrapper[4708]: I0320 16:03:12.111462 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:12 crc kubenswrapper[4708]: E0320 16:03:12.111601 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.110199 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.110270 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:13 crc kubenswrapper[4708]: E0320 16:03:13.110351 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:13 crc kubenswrapper[4708]: E0320 16:03:13.110485 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.646347 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8kspl_f49a68df-98d0-464f-b40e-0aba2faab528/kube-multus/0.log" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.646459 4708 generic.go:334] "Generic (PLEG): container finished" podID="f49a68df-98d0-464f-b40e-0aba2faab528" containerID="527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537" exitCode=1 Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.646523 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8kspl" event={"ID":"f49a68df-98d0-464f-b40e-0aba2faab528","Type":"ContainerDied","Data":"527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537"} Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.647390 4708 scope.go:117] "RemoveContainer" containerID="527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.671560 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.694991 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.708071 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.733701 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646
ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.755028 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.770982 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.790841 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.813092 4708 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03
-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca
0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.832033 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a995f8b-228b-4ae2-925c-41f31924d374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce00a794483a44bc8052d04362f63fd12b2abe8577d8f276aadb4609c40709f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67a83c494b01d6dcf1c0ff7f564cf91d62c6e0f19fc7d4301dc4cd3f5da38b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf231961d33f62280cb7698c00ad3326849806ddc1d5bf010cb11f6cf5a27c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.861464 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.875163 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"message\\\":\\\"2026-03-20T16:02:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392\\\\n2026-03-20T16:02:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392 to /host/opt/cni/bin/\\\\n2026-03-20T16:02:27Z [verbose] multus-daemon started\\\\n2026-03-20T16:02:27Z [verbose] Readiness Indicator file check\\\\n2026-03-20T16:03:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.889241 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.904640 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d
4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.937317 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"t-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 16:02:54.133730 6975 services_controller.go:445] Built service openshift-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 16:02:54.133755 6975 ovnkube.go:137] failed to run ovnkube: [failed to 
start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:54.133762 6975 services_controller.go:451] Built service openshift-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protoc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.952450 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.964260 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc 
kubenswrapper[4708]: I0320 16:03:13.978900 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:13 crc kubenswrapper[4708]: I0320 16:03:13.994095 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:13Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.006378 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.110444 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:14 crc kubenswrapper[4708]: E0320 16:03:14.110836 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.111010 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:14 crc kubenswrapper[4708]: E0320 16:03:14.112301 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.651401 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8kspl_f49a68df-98d0-464f-b40e-0aba2faab528/kube-multus/0.log" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.651486 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8kspl" event={"ID":"f49a68df-98d0-464f-b40e-0aba2faab528","Type":"ContainerStarted","Data":"2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b"} Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.668140 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.687015 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.701601 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.721785 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646
ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.739096 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.755648 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.768509 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.786450 4708 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03
-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca
0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.799388 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a995f8b-228b-4ae2-925c-41f31924d374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce00a794483a44bc8052d04362f63fd12b2abe8577d8f276aadb4609c40709f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67a83c494b01d6dcf1c0ff7f564cf91d62c6e0f19fc7d4301dc4cd3f5da38b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf231961d33f62280cb7698c00ad3326849806ddc1d5bf010cb11f6cf5a27c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.825523 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.845257 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"message\\\":\\\"2026-03-20T16:02:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392\\\\n2026-03-20T16:02:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392 to /host/opt/cni/bin/\\\\n2026-03-20T16:02:27Z [verbose] multus-daemon started\\\\n2026-03-20T16:02:27Z [verbose] Readiness Indicator file check\\\\n2026-03-20T16:03:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.861176 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.877045 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d
4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.898662 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"t-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 16:02:54.133730 6975 services_controller.go:445] Built service openshift-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 16:02:54.133755 6975 ovnkube.go:137] failed to run ovnkube: [failed to 
start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:54.133762 6975 services_controller.go:451] Built service openshift-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protoc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.912549 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.927952 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc 
kubenswrapper[4708]: I0320 16:03:14.941869 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.955542 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:14 crc kubenswrapper[4708]: I0320 16:03:14.969247 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:14Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.110752 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.110752 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:15 crc kubenswrapper[4708]: E0320 16:03:15.110909 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:15 crc kubenswrapper[4708]: E0320 16:03:15.110981 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.553296 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.553356 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.553372 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.553395 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.553411 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4708]: E0320 16:03:15.573037 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.578133 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.578172 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.578186 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.578206 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.578220 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4708]: E0320 16:03:15.594612 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.602918 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.602997 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.603033 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.603053 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.603066 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4708]: E0320 16:03:15.621395 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.625563 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.625596 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.625605 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.625620 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.625629 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4708]: E0320 16:03:15.643283 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.646655 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.646730 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.646743 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.646785 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:15 crc kubenswrapper[4708]: I0320 16:03:15.646798 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:15Z","lastTransitionTime":"2026-03-20T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:15 crc kubenswrapper[4708]: E0320 16:03:15.665109 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:15Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:15 crc kubenswrapper[4708]: E0320 16:03:15.665273 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.110186 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.110229 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:16 crc kubenswrapper[4708]: E0320 16:03:16.110384 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:16 crc kubenswrapper[4708]: E0320 16:03:16.110490 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.136479 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.158711 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.179915 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.202977 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.217089 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.230962 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e7
38fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.250292 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: E0320 16:03:16.251503 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.280780 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\
\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7a
f9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.299381 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"message\\\":\\\"2026-03-20T16:02:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392\\\\n2026-03-20T16:02:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392 to /host/opt/cni/bin/\\\\n2026-03-20T16:02:27Z [verbose] multus-daemon started\\\\n2026-03-20T16:02:27Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T16:03:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.317256 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642
ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.330165 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a995f8b-228b-4ae2-925c-41f31924d374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce00a794483a44bc8052d04362f63fd12b2abe8577d8f276aadb4609c40709f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67a83c494b01d6dcf1c0ff7f564cf91d62c6e0f19fc7d4301dc4cd3f5da38b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf231961d33f62280cb7698c00ad3326849806ddc1d5bf010cb11f6cf5a27c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.344942 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.367860 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.387432 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.410707 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.442027 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"t-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 16:02:54.133730 6975 services_controller.go:445] Built service openshift-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 16:02:54.133755 6975 ovnkube.go:137] failed to run ovnkube: [failed to 
start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:54.133762 6975 services_controller.go:451] Built service openshift-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protoc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.456120 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc kubenswrapper[4708]: I0320 16:03:16.467424 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:16 crc 
kubenswrapper[4708]: I0320 16:03:16.479955 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:16Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:17 crc kubenswrapper[4708]: I0320 16:03:17.110216 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:17 crc kubenswrapper[4708]: I0320 16:03:17.110375 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:17 crc kubenswrapper[4708]: E0320 16:03:17.110500 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:17 crc kubenswrapper[4708]: E0320 16:03:17.110810 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:18 crc kubenswrapper[4708]: I0320 16:03:18.110864 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:18 crc kubenswrapper[4708]: I0320 16:03:18.110903 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:18 crc kubenswrapper[4708]: E0320 16:03:18.111085 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:18 crc kubenswrapper[4708]: E0320 16:03:18.111150 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:19 crc kubenswrapper[4708]: I0320 16:03:19.110367 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:19 crc kubenswrapper[4708]: I0320 16:03:19.110367 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:19 crc kubenswrapper[4708]: E0320 16:03:19.110626 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:19 crc kubenswrapper[4708]: E0320 16:03:19.110764 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:20 crc kubenswrapper[4708]: I0320 16:03:20.110794 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:20 crc kubenswrapper[4708]: I0320 16:03:20.110823 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:20 crc kubenswrapper[4708]: E0320 16:03:20.111062 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:20 crc kubenswrapper[4708]: E0320 16:03:20.111224 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:21 crc kubenswrapper[4708]: I0320 16:03:21.110788 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:21 crc kubenswrapper[4708]: E0320 16:03:21.110921 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:21 crc kubenswrapper[4708]: I0320 16:03:21.110818 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:21 crc kubenswrapper[4708]: E0320 16:03:21.111079 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:21 crc kubenswrapper[4708]: E0320 16:03:21.252544 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:22 crc kubenswrapper[4708]: I0320 16:03:22.110869 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:22 crc kubenswrapper[4708]: I0320 16:03:22.110869 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:22 crc kubenswrapper[4708]: E0320 16:03:22.110999 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:22 crc kubenswrapper[4708]: E0320 16:03:22.111141 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.110315 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:23 crc kubenswrapper[4708]: E0320 16:03:23.110534 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.110702 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:23 crc kubenswrapper[4708]: E0320 16:03:23.111078 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.111368 4708 scope.go:117] "RemoveContainer" containerID="36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.751731 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/2.log" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.753223 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c"} Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.754369 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.772829 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.785987 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"message\\\":\\\"2026-03-20T16:02:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392\\\\n2026-03-20T16:02:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392 to /host/opt/cni/bin/\\\\n2026-03-20T16:02:27Z [verbose] multus-daemon started\\\\n2026-03-20T16:02:27Z [verbose] Readiness Indicator file check\\\\n2026-03-20T16:03:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.805535 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.817959 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a995f8b-228b-4ae2-925c-41f31924d374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce00a794483a44bc8052d04362f63fd12b2abe8577d8f276aadb4609c40709f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67a83c494b01d6dcf1c0ff7f564cf91d62c6e0f19fc7d4301dc4cd3f5da38b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf231961d33f62280cb7698c00ad3326849806ddc1d5bf010cb11f6cf5a27c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.829238 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.842191 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.853128 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.867225 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.890231 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"t-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 16:02:54.133730 6975 services_controller.go:445] Built service openshift-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 16:02:54.133755 6975 ovnkube.go:137] failed to run ovnkube: [failed to 
start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:54.133762 6975 services_controller.go:451] Built service openshift-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protoc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.901795 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.916194 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc 
kubenswrapper[4708]: I0320 16:03:23.932526 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.951711 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646
ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.966740 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.983964 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:23 crc kubenswrapper[4708]: I0320 16:03:23.997502 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:23Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.012403 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.025857 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e7
38fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.039437 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.110109 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:24 crc kubenswrapper[4708]: E0320 16:03:24.110253 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.110298 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:24 crc kubenswrapper[4708]: E0320 16:03:24.110510 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.757885 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/3.log" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.758330 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/2.log" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.760565 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" exitCode=1 Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.760599 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c"} Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.760631 4708 scope.go:117] "RemoveContainer" containerID="36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.761318 4708 scope.go:117] "RemoveContainer" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" Mar 20 16:03:24 crc kubenswrapper[4708]: E0320 16:03:24.761446 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" Mar 20 16:03:24 crc 
kubenswrapper[4708]: I0320 16:03:24.787924 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d
8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.805514 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a995f8b-228b-4ae2-925c-41f31924d374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce00a794483a44bc8052d04362f63fd12b2abe8577d8f276aadb4609c40709f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-s
cheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67a83c494b01d6dcf1c0ff7f564cf91d62c6e0f19fc7d4301dc4cd3f5da38b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf231961d33f62280cb7698c00ad3326849806ddc1d5bf010cb11f6cf5a27c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.824930 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.840272 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"message\\\":\\\"2026-03-20T16:02:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392\\\\n2026-03-20T16:02:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392 to /host/opt/cni/bin/\\\\n2026-03-20T16:02:27Z [verbose] multus-daemon started\\\\n2026-03-20T16:02:27Z [verbose] Readiness Indicator file check\\\\n2026-03-20T16:03:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.851077 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.861402 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d
4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.888923 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ac2a7ed7dfb4494bcb925faaf8785fda975337828cb3b3d25694faac88f2c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:02:54Z\\\",\\\"message\\\":\\\"t-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 16:02:54.133730 6975 services_controller.go:445] Built service openshift-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 16:02:54.133755 6975 ovnkube.go:137] failed to run ovnkube: [failed to 
start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:02:54Z is after 2025-08-24T17:21:41Z]\\\\nI0320 16:02:54.133762 6975 services_controller.go:451] Built service openshift-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protoc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:23Z\\\",\\\"message\\\":\\\"s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0320 16:03:23.942303 7298 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 2.946721ms\\\\nI0320 16:03:23.942280 7298 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 16:03:23.942337 7298 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0320 16:03:23.942331 7298 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:03:23.942424 7298 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 16:03:23.942436 7298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 16:03:23.942449 7298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 16:03:23.942470 7298 factory.go:656] Stopping watch factory\\\\nI0320 16:03:23.942509 7298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 16:03:23.942884 7298 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 16:03:23.943062 7298 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:23.943154 7298 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:23.943220 7298 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:23.943343 7298 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\
"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.901884 4708 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.914006 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc 
kubenswrapper[4708]: I0320 16:03:24.926412 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.937041 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.948821 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.963295 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.976271 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.986280 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:24 crc kubenswrapper[4708]: I0320 16:03:24.997728 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646
ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:24Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.009741 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.024798 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.036713 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.109927 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.110022 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:25 crc kubenswrapper[4708]: E0320 16:03:25.110080 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:25 crc kubenswrapper[4708]: E0320 16:03:25.110276 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.766542 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/3.log" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.771079 4708 scope.go:117] "RemoveContainer" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" Mar 20 16:03:25 crc kubenswrapper[4708]: E0320 16:03:25.771281 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.784143 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a995f8b-228b-4ae2-925c-41f31924d374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce00a794483a44bc8052d04362f63fd12b2abe8577d8f276aadb4609c40709f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67a83c494b01d6dcf1c0ff7f564cf91d62c6e0f19fc7d4301dc4cd3f5da38b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf231961d33f62280cb7698c00ad3326849806ddc1d5bf010cb11f6cf5a27c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.804307 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.819504 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"message\\\":\\\"2026-03-20T16:02:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392\\\\n2026-03-20T16:02:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392 to /host/opt/cni/bin/\\\\n2026-03-20T16:02:27Z [verbose] multus-daemon started\\\\n2026-03-20T16:02:27Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T16:03:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.834846 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642
ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.847788 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.857866 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc 
kubenswrapper[4708]: I0320 16:03:25.871109 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.886414 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.900173 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.913290 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.928821 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.950030 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:23Z\\\",\\\"message\\\":\\\"s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0320 16:03:23.942303 7298 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 
2.946721ms\\\\nI0320 16:03:23.942280 7298 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 16:03:23.942337 7298 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 16:03:23.942331 7298 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:03:23.942424 7298 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 16:03:23.942436 7298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 16:03:23.942449 7298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 16:03:23.942470 7298 factory.go:656] Stopping watch factory\\\\nI0320 16:03:23.942509 7298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 16:03:23.942884 7298 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 16:03:23.943062 7298 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:23.943154 7298 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:23.943220 7298 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:23.943343 7298 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.962592 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.979473 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646
ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.994417 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.994487 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.994506 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.994534 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.994555 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:25Z","lastTransitionTime":"2026-03-20T16:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:25 crc kubenswrapper[4708]: I0320 16:03:25.995131 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:25Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.010878 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: E0320 16:03:26.011923 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.016667 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.016728 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.016740 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.016757 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.016767 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:26Z","lastTransitionTime":"2026-03-20T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.027421 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: E0320 16:03:26.027519 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.031969 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.032019 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.032030 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.032048 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.032060 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:26Z","lastTransitionTime":"2026-03-20T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.044297 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: E0320 16:03:26.046575 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.050298 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.050349 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.050359 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.050381 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.050392 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:26Z","lastTransitionTime":"2026-03-20T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.060725 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: E0320 16:03:26.062485 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.067042 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.067102 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.067122 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.067141 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.067154 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:26Z","lastTransitionTime":"2026-03-20T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:26 crc kubenswrapper[4708]: E0320 16:03:26.092445 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: E0320 16:03:26.092762 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.110559 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:26 crc kubenswrapper[4708]: E0320 16:03:26.111913 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.110614 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:26 crc kubenswrapper[4708]: E0320 16:03:26.112366 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.125443 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.145100 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646
ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.157431 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.173191 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.185480 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.201759 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.216098 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.231476 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a995f8b-228b-4ae2-925c-41f31924d374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce00a794483a44bc8052d04362f63fd12b2abe8577d8f276aadb4609c40709f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67a83c494b01d6dcf1c0ff7f564cf91d62c6e0f19fc7d4301dc4cd3f5da38b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf231961d33f62280cb7698c00ad3326849806ddc1d5bf010cb11f6cf5a27c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: E0320 16:03:26.253849 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
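Editorial note on the records above and below: every status patch in this log fails with the same TLS error from the `pod.network-node-identity.openshift.io` webhook. The arithmetic behind the failure can be checked directly from the two timestamps the kubelet itself reports; this is a minimal sketch using only those logged values (the `parse_rfc3339` helper is illustrative, not part of any OpenShift tooling):

```python
from datetime import datetime, timezone

# Both timestamps are taken verbatim from the kubelet log records:
#   "certificate has expired or is not yet valid:
#    current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z"
NOT_AFTER = "2025-08-24T17:21:41Z"   # webhook serving certificate's notAfter
CURRENT   = "2026-03-20T16:03:26Z"   # kubelet's clock at the time of the call

def parse_rfc3339(ts: str) -> datetime:
    """Parse the 'Z'-suffixed RFC 3339 timestamps used in the journal."""
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

not_after = parse_rfc3339(NOT_AFTER)
current = parse_rfc3339(CURRENT)

# The TLS handshake is rejected precisely because current > notAfter,
# so every "failed to patch status" record below is a symptom, not a cause.
expired_for = current - not_after
print(f"certificate expired {expired_for.days} days before the webhook call")
```

Because the failure is a hard x509 validity check, every subsequent `status_manager.go:875` record in this chunk repeats the identical error until the certificate is rotated or the node clock is corrected.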
Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.263047 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.282015 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d7
374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"message\\\":\\\"2026-03-20T16:02:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392\\\\n2026-03-20T16:02:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392 to /host/opt/cni/bin/\\\\n2026-03-20T16:02:27Z [verbose] multus-daemon started\\\\n2026-03-20T16:02:27Z [verbose] Readiness Indicator file check\\\\n2026-03-20T16:03:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.299340 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.317596 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc 
kubenswrapper[4708]: I0320 16:03:26.332367 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.344135 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.356027 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.366153 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.377125 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.398474 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:23Z\\\",\\\"message\\\":\\\"s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0320 16:03:23.942303 7298 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 
2.946721ms\\\\nI0320 16:03:23.942280 7298 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 16:03:23.942337 7298 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 16:03:23.942331 7298 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:03:23.942424 7298 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 16:03:23.942436 7298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 16:03:23.942449 7298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 16:03:23.942470 7298 factory.go:656] Stopping watch factory\\\\nI0320 16:03:23.942509 7298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 16:03:23.942884 7298 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 16:03:23.943062 7298 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:23.943154 7298 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:23.943220 7298 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:23.943343 7298 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:26 crc kubenswrapper[4708]: I0320 16:03:26.408719 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:26Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:27 crc kubenswrapper[4708]: I0320 16:03:27.110211 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:27 crc kubenswrapper[4708]: I0320 16:03:27.110252 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:27 crc kubenswrapper[4708]: E0320 16:03:27.110358 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:27 crc kubenswrapper[4708]: E0320 16:03:27.110534 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:28 crc kubenswrapper[4708]: I0320 16:03:28.110001 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:28 crc kubenswrapper[4708]: E0320 16:03:28.110121 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:28 crc kubenswrapper[4708]: I0320 16:03:28.110172 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:28 crc kubenswrapper[4708]: E0320 16:03:28.110355 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:29 crc kubenswrapper[4708]: I0320 16:03:29.110066 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:29 crc kubenswrapper[4708]: I0320 16:03:29.110171 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:29 crc kubenswrapper[4708]: E0320 16:03:29.110215 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:29 crc kubenswrapper[4708]: E0320 16:03:29.110408 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:30 crc kubenswrapper[4708]: I0320 16:03:30.111026 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:30 crc kubenswrapper[4708]: I0320 16:03:30.111069 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:30 crc kubenswrapper[4708]: E0320 16:03:30.111211 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:30 crc kubenswrapper[4708]: E0320 16:03:30.111378 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:31 crc kubenswrapper[4708]: I0320 16:03:31.109987 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:31 crc kubenswrapper[4708]: I0320 16:03:31.110011 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:31 crc kubenswrapper[4708]: E0320 16:03:31.110170 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:31 crc kubenswrapper[4708]: E0320 16:03:31.110306 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:31 crc kubenswrapper[4708]: E0320 16:03:31.255482 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:32 crc kubenswrapper[4708]: I0320 16:03:32.111058 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:32 crc kubenswrapper[4708]: I0320 16:03:32.111126 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:32 crc kubenswrapper[4708]: E0320 16:03:32.111253 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:32 crc kubenswrapper[4708]: E0320 16:03:32.111392 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:33 crc kubenswrapper[4708]: I0320 16:03:33.110242 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:33 crc kubenswrapper[4708]: I0320 16:03:33.110313 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:33 crc kubenswrapper[4708]: E0320 16:03:33.110564 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:33 crc kubenswrapper[4708]: E0320 16:03:33.111509 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:34 crc kubenswrapper[4708]: I0320 16:03:34.110334 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:34 crc kubenswrapper[4708]: E0320 16:03:34.111083 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:34 crc kubenswrapper[4708]: I0320 16:03:34.110374 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:34 crc kubenswrapper[4708]: E0320 16:03:34.111233 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:35 crc kubenswrapper[4708]: I0320 16:03:35.110809 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:35 crc kubenswrapper[4708]: I0320 16:03:35.110870 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:35 crc kubenswrapper[4708]: E0320 16:03:35.110989 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:35 crc kubenswrapper[4708]: E0320 16:03:35.111147 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.110879 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.110956 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:36 crc kubenswrapper[4708]: E0320 16:03:36.111020 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:36 crc kubenswrapper[4708]: E0320 16:03:36.111122 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.122836 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc 
kubenswrapper[4708]: I0320 16:03:36.139003 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.152067 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.166448 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.176079 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.186781 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.203747 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:23Z\\\",\\\"message\\\":\\\"s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0320 16:03:23.942303 7298 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 
2.946721ms\\\\nI0320 16:03:23.942280 7298 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 16:03:23.942337 7298 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 16:03:23.942331 7298 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:03:23.942424 7298 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 16:03:23.942436 7298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 16:03:23.942449 7298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 16:03:23.942470 7298 factory.go:656] Stopping watch factory\\\\nI0320 16:03:23.942509 7298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 16:03:23.942884 7298 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 16:03:23.943062 7298 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:23.943154 7298 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:23.943220 7298 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:23.943343 7298 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.215215 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.224212 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc 
kubenswrapper[4708]: I0320 16:03:36.236222 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6
f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.248056 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: E0320 16:03:36.255996 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.263722 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.274303 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.284770 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.295033 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cbe481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.304513 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a995f8b-228b-4ae2-925c-41f31924d374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce00a794483a44bc8052d04362f63fd12b2abe8577d8f276aadb4609c40709f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67a83c494b01d6dcf1c0ff7f564cf91d62c6e0f19fc7d4301dc4cd3f5da38b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf231961d33f62280cb7698c00ad3326849806ddc1d5bf010cb11f6cf5a27c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.320733 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.332575 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"message\\\":\\\"2026-03-20T16:02:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392\\\\n2026-03-20T16:02:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392 to /host/opt/cni/bin/\\\\n2026-03-20T16:02:27Z [verbose] multus-daemon started\\\\n2026-03-20T16:02:27Z [verbose] Readiness Indicator file check\\\\n2026-03-20T16:03:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.345221 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.362657 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.362707 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.362717 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.362734 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.362746 4708 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:36Z","lastTransitionTime":"2026-03-20T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 16:03:36 crc kubenswrapper[4708]: E0320 16:03:36.373930 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.377896 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.377939 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.377949 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.377966 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.377976 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:36Z","lastTransitionTime":"2026-03-20T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:36 crc kubenswrapper[4708]: E0320 16:03:36.389328 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.392794 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.392832 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.392841 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.392857 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.392868 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:36Z","lastTransitionTime":"2026-03-20T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:36 crc kubenswrapper[4708]: E0320 16:03:36.402805 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.405872 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.405918 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.405932 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.405952 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.405964 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:36Z","lastTransitionTime":"2026-03-20T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:36 crc kubenswrapper[4708]: E0320 16:03:36.416856 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.420047 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.420206 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.420322 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.420440 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:36 crc kubenswrapper[4708]: I0320 16:03:36.420545 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:36Z","lastTransitionTime":"2026-03-20T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:36 crc kubenswrapper[4708]: E0320 16:03:36.436122 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:36Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:36 crc kubenswrapper[4708]: E0320 16:03:36.436241 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:03:37 crc kubenswrapper[4708]: I0320 16:03:37.110706 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:37 crc kubenswrapper[4708]: I0320 16:03:37.110759 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:37 crc kubenswrapper[4708]: E0320 16:03:37.111136 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:37 crc kubenswrapper[4708]: E0320 16:03:37.111323 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:38 crc kubenswrapper[4708]: I0320 16:03:38.111024 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:38 crc kubenswrapper[4708]: I0320 16:03:38.111133 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:38 crc kubenswrapper[4708]: E0320 16:03:38.111180 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:38 crc kubenswrapper[4708]: E0320 16:03:38.111251 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:39 crc kubenswrapper[4708]: I0320 16:03:39.110848 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:39 crc kubenswrapper[4708]: I0320 16:03:39.110908 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:39 crc kubenswrapper[4708]: E0320 16:03:39.111052 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:39 crc kubenswrapper[4708]: E0320 16:03:39.111190 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:40 crc kubenswrapper[4708]: I0320 16:03:40.110811 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:40 crc kubenswrapper[4708]: I0320 16:03:40.110815 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:40 crc kubenswrapper[4708]: E0320 16:03:40.111160 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:40 crc kubenswrapper[4708]: E0320 16:03:40.111323 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:41 crc kubenswrapper[4708]: I0320 16:03:41.110865 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:41 crc kubenswrapper[4708]: I0320 16:03:41.110942 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:41 crc kubenswrapper[4708]: E0320 16:03:41.111505 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:41 crc kubenswrapper[4708]: E0320 16:03:41.111600 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:41 crc kubenswrapper[4708]: I0320 16:03:41.111767 4708 scope.go:117] "RemoveContainer" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" Mar 20 16:03:41 crc kubenswrapper[4708]: E0320 16:03:41.111978 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" Mar 20 16:03:41 crc kubenswrapper[4708]: E0320 16:03:41.257477 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:42 crc kubenswrapper[4708]: I0320 16:03:42.110942 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:42 crc kubenswrapper[4708]: I0320 16:03:42.110945 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:42 crc kubenswrapper[4708]: E0320 16:03:42.111162 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:42 crc kubenswrapper[4708]: E0320 16:03:42.111225 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:42 crc kubenswrapper[4708]: I0320 16:03:42.676533 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:42 crc kubenswrapper[4708]: E0320 16:03:42.676748 4708 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:42 crc kubenswrapper[4708]: E0320 16:03:42.676819 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs podName:3574461f-8c2b-446b-a2f1-c1be3a8d7824 nodeName:}" failed. No retries permitted until 2026-03-20 16:04:46.676801546 +0000 UTC m=+241.351138271 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs") pod "network-metrics-daemon-gtlzm" (UID: "3574461f-8c2b-446b-a2f1-c1be3a8d7824") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 16:03:43 crc kubenswrapper[4708]: I0320 16:03:43.110570 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:43 crc kubenswrapper[4708]: I0320 16:03:43.110663 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:43 crc kubenswrapper[4708]: E0320 16:03:43.110794 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:43 crc kubenswrapper[4708]: E0320 16:03:43.111026 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:44 crc kubenswrapper[4708]: I0320 16:03:44.111254 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:44 crc kubenswrapper[4708]: E0320 16:03:44.111425 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:44 crc kubenswrapper[4708]: I0320 16:03:44.111262 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:44 crc kubenswrapper[4708]: E0320 16:03:44.111757 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:45 crc kubenswrapper[4708]: I0320 16:03:45.109905 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:45 crc kubenswrapper[4708]: I0320 16:03:45.109912 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:45 crc kubenswrapper[4708]: E0320 16:03:45.110074 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:45 crc kubenswrapper[4708]: E0320 16:03:45.110122 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.116630 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:46 crc kubenswrapper[4708]: E0320 16:03:46.116965 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.118469 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:46 crc kubenswrapper[4708]: E0320 16:03:46.118845 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.136738 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcb5615-9fde-435c-9581-38417dd8d40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3bb84100ee666236ddbe82e0cb55ea08d09153f9401fa5ae51ae19ba6342c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93eadd025b924d042013c9914b54687d1b048d4f1b76962a23186e6c7b1d39b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.160458 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46f2e587-1a2b-476f-aaf1-a95fec8e0434\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 16:01:51.712444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 16:01:51.712621 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 16:01:51.713578 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3640425149/tls.crt::/tmp/serving-cert-3640425149/tls.key\\\\\\\"\\\\nI0320 16:01:52.164064 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 16:01:52.166728 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 16:01:52.166748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 16:01:52.166774 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 16:01:52.166788 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 16:01:52.175705 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 16:01:52.175720 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 16:01:52.175727 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175734 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 16:01:52.175738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 16:01:52.175741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 16:01:52.175744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 16:01:52.175746 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 16:01:52.178830 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:01:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0ad5750202943db395950d0aa188660646
ddbfe0b9f238252773ca218dfdfe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.179770 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.201491 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621657ff84a2c3ade65268461574326317cf772d5b47ff0ac2a2f0ae122338cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a34664ccecdb0b52c6587db2edc755545e167683a7aa918d780e96612fb71577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.223763 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774d7577b00ba824b703d73d4afb326f7d8af770810dfd68b1137ee3de1e2604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.252526 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdf22413d064f3ebd119f74df242f6cd669a7cce9d06f3715a8ffc7a9c3bcff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: E0320 16:03:46.259106 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.274054 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a798e17b-98de-4215-abe5-82adf76e66ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb093c15a6e173db65e98c8da748c9f34b07cb
e481cd56fceb1041c08a565667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27e84b0fbbf466b43038e978d2965976d4e738fc3173f33adf6dc7077199cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lwld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.12
6.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68gt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.293368 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a995f8b-228b-4ae2-925c-41f31924d374\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce00a794483a44bc8052d04362f63fd12b2abe8577d8f276aadb4609c40709f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67a83c494b01d6dcf1c0ff7f564cf91d62c6e0f19fc7d4301dc4cd3f5da38b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf231961d33f62280cb7698c00ad3326849806ddc1d5bf010cb11f6cf5a27c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab1085ad15109ea9e565d9e3b035bc41b8e5beca910d0fa929a6b2294f61a0e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.324610 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc24d522-16e8-45e2-b784-bc8bd8793d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced7967ab07341d6d9a32883a830e210233c7ef3c2b9abefb260d2244fff0ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee1abfa12646b3a6005f4ddccf1f0bbd7908faa245b2077587e121fe1dff3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4667f0008a2f50345492fdccf1b11fdf7ec77a42eddd38a693be445de9b6feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fce60cd4607846fa7094bcf70209dd40a571b8487f30846e8091e464b821bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2330e0dd5a5fce954a1894ccfbd5804f24f69e9c5c52755f1d9e856acad46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a8e479d60b035895c7af9152cfdf39e85bd9a6b800fd4b9be00a1474cf91c96\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T16:00:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5f058cbf00d5d6d0481ac399587574a46b468448894de030e628e3068113493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1763441cf7543f4d0a704f105b2f1ffe1129df87c5099ce545a2ccd2fff2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.343612 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8kspl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f49a68df-98d0-464f-b40e-0aba2faab528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:12Z\\\",\\\"message\\\":\\\"2026-03-20T16:02:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392\\\\n2026-03-20T16:02:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e0c4880c-5e1b-4fc9-ae3b-f1bf43be4392 to /host/opt/cni/bin/\\\\n2026-03-20T16:02:27Z [verbose] multus-daemon started\\\\n2026-03-20T16:02:27Z [verbose] Readiness Indicator file check\\\\n2026-03-20T16:03:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:03:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh9ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8kspl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.368383 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m98sv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9ddc889-df44-41f4-bb84-bb103bf9695a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://642ba9b8760d9346f95a60bd2fb17a94ea7d6765a6baa73da102ac466a0dd18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10dd8d189416d5bd0848096fdc5e49ec091fddadd26d67cebb9b8f77a4a3b2de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f36175a42f64252157fbdb38a0d8d11a07a357234af264113a00999f5567e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://7b278f6a9fe12a0211c0d61e8cbfb8831da30a61b339a9bc9617a772ad129d4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ba5cedfad8418153ddde2f12ae1167cc202bc2e716be2fa3f95545b71fd89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e046f70457cb2fa6d2ef94c5d92d715a9f0ba50f3bbf7208602ca0bf0600284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b92447994f9810c40796321530f738938fbafa45c8f2c6f861543370745d397\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cfdz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m98sv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.383403 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3574461f-8c2b-446b-a2f1-c1be3a8d7824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnkl7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gtlzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc 
kubenswrapper[4708]: I0320 16:03:46.399334 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90bd077a-57d3-4159-beb6-ea6d79eb4c15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:00:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d400a373fa5e368abbb077f012827da6058c51410e47f85442cb8ee03ebcca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89fd29011e48fe84893f9f2cd60da953266b4c5cf120b71ae7fbec971b526a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T16:01:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 16:00:48.439784 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 16:00:48.442696 1 observer_polling.go:159] Starting file observer\\\\nI0320 16:00:48.480296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 16:00:48.485588 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 16:01:18.170572 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 16:01:18.170758 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:01:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac0e0288686a8af2b4082316f321a9797b201f4a2756aded844e407dfca995e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37a00fe3f5c5fc475d7c771398a7dd28182284dd0f5f6f04eaec68f15da2dc5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:00:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:00:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.418908 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.436719 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.451061 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4jhr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83526619-9c43-409d-af71-0b8ebbe71231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7c8cbf81e60e56030a6d3cc9e8b0b10b6820001359326e12e22243a1d68e40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x788b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4jhr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.467886 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd987d1-f981-4e7a-b063-920f84a0d7f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e0954181d0014916d20d7276d6decd0b06e2c0b83e939108ca7bf987dbe0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtslg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgbv9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.490379 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T16:03:23Z\\\",\\\"message\\\":\\\"s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0320 16:03:23.942303 7298 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 
2.946721ms\\\\nI0320 16:03:23.942280 7298 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 16:03:23.942337 7298 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 16:03:23.942331 7298 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 16:03:23.942424 7298 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 16:03:23.942436 7298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 16:03:23.942449 7298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 16:03:23.942470 7298 factory.go:656] Stopping watch factory\\\\nI0320 16:03:23.942509 7298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 16:03:23.942884 7298 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 16:03:23.943062 7298 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 16:03:23.943154 7298 ovnkube.go:599] Stopped ovnkube\\\\nI0320 16:03:23.943220 7298 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 16:03:23.943343 7298 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T16:03:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96776f97ae364b5c06
226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T16:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T16:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fwwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rcmhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.508816 4708 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-trmk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e14ba83-1670-47d0-85b5-a1e8fbdafe62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T16:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa760e1061ca7b142bcbd55042c4b23c75b742a704f75f1215640fbbadc1e431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T16:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5p8q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T16:02:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-trmk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.799520 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.799574 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.799583 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.799599 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.799609 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:46Z","lastTransitionTime":"2026-03-20T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:46 crc kubenswrapper[4708]: E0320 16:03:46.819570 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.826062 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.826130 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.826148 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.826177 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.826202 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:46Z","lastTransitionTime":"2026-03-20T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:46 crc kubenswrapper[4708]: E0320 16:03:46.846480 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.851911 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.851955 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.851969 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.851986 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.851998 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:46Z","lastTransitionTime":"2026-03-20T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.907884 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.908122 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.908319 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.908488 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:46 crc kubenswrapper[4708]: I0320 16:03:46.908643 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:46Z","lastTransitionTime":"2026-03-20T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:46 crc kubenswrapper[4708]: E0320 16:03:46.929429 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T16:03:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aab1a40b-9efc-47fe-9821-27ec8e6c1980\\\",\\\"systemUUID\\\":\\\"445dca2f-6b37-4b9f-94a5-2336a8fbca00\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T16:03:46Z is after 2025-08-24T17:21:41Z" Mar 20 16:03:46 crc kubenswrapper[4708]: E0320 16:03:46.929578 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:03:47 crc kubenswrapper[4708]: I0320 16:03:47.110333 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:47 crc kubenswrapper[4708]: I0320 16:03:47.110415 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:47 crc kubenswrapper[4708]: E0320 16:03:47.110564 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:47 crc kubenswrapper[4708]: E0320 16:03:47.110821 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:48 crc kubenswrapper[4708]: I0320 16:03:48.110950 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:48 crc kubenswrapper[4708]: I0320 16:03:48.111034 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:48 crc kubenswrapper[4708]: E0320 16:03:48.112101 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:48 crc kubenswrapper[4708]: E0320 16:03:48.112266 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:49 crc kubenswrapper[4708]: I0320 16:03:49.110821 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:49 crc kubenswrapper[4708]: I0320 16:03:49.110915 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:49 crc kubenswrapper[4708]: E0320 16:03:49.111013 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:49 crc kubenswrapper[4708]: E0320 16:03:49.111347 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:50 crc kubenswrapper[4708]: I0320 16:03:50.111115 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:50 crc kubenswrapper[4708]: I0320 16:03:50.111136 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:50 crc kubenswrapper[4708]: E0320 16:03:50.111354 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:50 crc kubenswrapper[4708]: E0320 16:03:50.111928 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:51 crc kubenswrapper[4708]: I0320 16:03:51.110458 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:51 crc kubenswrapper[4708]: I0320 16:03:51.110450 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:51 crc kubenswrapper[4708]: E0320 16:03:51.110661 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:51 crc kubenswrapper[4708]: E0320 16:03:51.111413 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:51 crc kubenswrapper[4708]: E0320 16:03:51.260225 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:03:52 crc kubenswrapper[4708]: I0320 16:03:52.111000 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:52 crc kubenswrapper[4708]: E0320 16:03:52.111224 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:52 crc kubenswrapper[4708]: I0320 16:03:52.111355 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:52 crc kubenswrapper[4708]: E0320 16:03:52.111567 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:53 crc kubenswrapper[4708]: I0320 16:03:53.110326 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:53 crc kubenswrapper[4708]: I0320 16:03:53.110341 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:53 crc kubenswrapper[4708]: E0320 16:03:53.110477 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:53 crc kubenswrapper[4708]: E0320 16:03:53.110735 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:54 crc kubenswrapper[4708]: I0320 16:03:54.110377 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:54 crc kubenswrapper[4708]: I0320 16:03:54.110487 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:54 crc kubenswrapper[4708]: E0320 16:03:54.110530 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:54 crc kubenswrapper[4708]: E0320 16:03:54.110971 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:54 crc kubenswrapper[4708]: I0320 16:03:54.111311 4708 scope.go:117] "RemoveContainer" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" Mar 20 16:03:54 crc kubenswrapper[4708]: E0320 16:03:54.111486 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rcmhv_openshift-ovn-kubernetes(079cc7a0-ceb7-4921-b022-bbe67ae0fad5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" Mar 20 16:03:55 crc kubenswrapper[4708]: I0320 16:03:55.110516 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:55 crc kubenswrapper[4708]: I0320 16:03:55.110523 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:55 crc kubenswrapper[4708]: E0320 16:03:55.110748 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:55 crc kubenswrapper[4708]: E0320 16:03:55.110855 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.110103 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.110235 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:56 crc kubenswrapper[4708]: E0320 16:03:56.110330 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:56 crc kubenswrapper[4708]: E0320 16:03:56.110417 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.167555 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=103.167520036 podStartE2EDuration="1m43.167520036s" podCreationTimestamp="2026-03-20 16:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:56.166651772 +0000 UTC m=+190.840988517" watchObservedRunningTime="2026-03-20 16:03:56.167520036 +0000 UTC m=+190.841856791" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.167906 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=98.167898137 podStartE2EDuration="1m38.167898137s" podCreationTimestamp="2026-03-20 16:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:56.141529966 +0000 UTC m=+190.815866721" watchObservedRunningTime="2026-03-20 16:03:56.167898137 +0000 UTC m=+190.842234892" Mar 20 16:03:56 crc kubenswrapper[4708]: E0320 16:03:56.262238 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.307971 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68gt6" podStartSLOduration=135.307945727 podStartE2EDuration="2m15.307945727s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:56.306541048 +0000 UTC m=+190.980877813" watchObservedRunningTime="2026-03-20 16:03:56.307945727 +0000 UTC m=+190.982282442" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.354773 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=100.354744963 podStartE2EDuration="1m40.354744963s" podCreationTimestamp="2026-03-20 16:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:56.352954344 +0000 UTC m=+191.027291079" watchObservedRunningTime="2026-03-20 16:03:56.354744963 +0000 UTC m=+191.029081678" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.355019 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.355013491 podStartE2EDuration="51.355013491s" podCreationTimestamp="2026-03-20 16:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:56.322049658 +0000 UTC m=+190.996386373" watchObservedRunningTime="2026-03-20 16:03:56.355013491 +0000 UTC m=+191.029350216" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.369990 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8kspl" podStartSLOduration=135.369966455 podStartE2EDuration="2m15.369966455s" 
podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:56.369555804 +0000 UTC m=+191.043892519" watchObservedRunningTime="2026-03-20 16:03:56.369966455 +0000 UTC m=+191.044303170" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.444088 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m98sv" podStartSLOduration=135.444073549 podStartE2EDuration="2m15.444073549s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:56.405044637 +0000 UTC m=+191.079381342" watchObservedRunningTime="2026-03-20 16:03:56.444073549 +0000 UTC m=+191.118410264" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.452308 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-trmk9" podStartSLOduration=136.452289946 podStartE2EDuration="2m16.452289946s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:56.451604637 +0000 UTC m=+191.125941352" watchObservedRunningTime="2026-03-20 16:03:56.452289946 +0000 UTC m=+191.126626661" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.479946 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=61.479927151 podStartE2EDuration="1m1.479927151s" podCreationTimestamp="2026-03-20 16:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:56.47950368 +0000 UTC m=+191.153840395" 
watchObservedRunningTime="2026-03-20 16:03:56.479927151 +0000 UTC m=+191.154263866" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.532950 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4jhr8" podStartSLOduration=136.53293151 podStartE2EDuration="2m16.53293151s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:56.532874829 +0000 UTC m=+191.207211544" watchObservedRunningTime="2026-03-20 16:03:56.53293151 +0000 UTC m=+191.207268225" Mar 20 16:03:56 crc kubenswrapper[4708]: I0320 16:03:56.554662 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podStartSLOduration=135.554626582 podStartE2EDuration="2m15.554626582s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:56.554209869 +0000 UTC m=+191.228546604" watchObservedRunningTime="2026-03-20 16:03:56.554626582 +0000 UTC m=+191.228963307" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.110277 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.110277 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:57 crc kubenswrapper[4708]: E0320 16:03:57.110906 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:57 crc kubenswrapper[4708]: E0320 16:03:57.110780 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.254354 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.254423 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.254445 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.254473 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.254494 4708 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T16:03:57Z","lastTransitionTime":"2026-03-20T16:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.310922 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2"] Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.311283 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.313803 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.314221 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.314286 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.316095 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.351812 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.351888 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: 
\"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.351915 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.351981 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.352028 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.368940 4708 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.380516 4708 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.453473 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.453548 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.453639 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.453719 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.453798 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 
crc kubenswrapper[4708]: I0320 16:03:57.453939 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.455058 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.455650 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.465465 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: \"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.486050 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f63fcbf-f15f-4a7f-87b4-1046fc831b4d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qb5k2\" (UID: 
\"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.632899 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" Mar 20 16:03:57 crc kubenswrapper[4708]: W0320 16:03:57.659384 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f63fcbf_f15f_4a7f_87b4_1046fc831b4d.slice/crio-3ded98acbddc8fe49f412740d76f941de5d4d21d9a3018f3dc2c030ae56bbfb9 WatchSource:0}: Error finding container 3ded98acbddc8fe49f412740d76f941de5d4d21d9a3018f3dc2c030ae56bbfb9: Status 404 returned error can't find the container with id 3ded98acbddc8fe49f412740d76f941de5d4d21d9a3018f3dc2c030ae56bbfb9 Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.892522 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" event={"ID":"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d","Type":"ContainerStarted","Data":"9afcc66fd95ce1edebb095f25df62478c46b9ed0254bdc52ab7b208cdb3218c7"} Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.892590 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" event={"ID":"1f63fcbf-f15f-4a7f-87b4-1046fc831b4d","Type":"ContainerStarted","Data":"3ded98acbddc8fe49f412740d76f941de5d4d21d9a3018f3dc2c030ae56bbfb9"} Mar 20 16:03:57 crc kubenswrapper[4708]: I0320 16:03:57.911697 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qb5k2" podStartSLOduration=137.911630649 podStartE2EDuration="2m17.911630649s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 16:03:57.910409395 +0000 UTC m=+192.584746130" watchObservedRunningTime="2026-03-20 16:03:57.911630649 +0000 UTC m=+192.585967374" Mar 20 16:03:58 crc kubenswrapper[4708]: I0320 16:03:58.110438 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:03:58 crc kubenswrapper[4708]: I0320 16:03:58.110438 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:03:58 crc kubenswrapper[4708]: E0320 16:03:58.110611 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:03:58 crc kubenswrapper[4708]: E0320 16:03:58.110737 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:03:59 crc kubenswrapper[4708]: I0320 16:03:59.110545 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:03:59 crc kubenswrapper[4708]: I0320 16:03:59.110636 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:03:59 crc kubenswrapper[4708]: E0320 16:03:59.110775 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:03:59 crc kubenswrapper[4708]: E0320 16:03:59.110957 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:03:59 crc kubenswrapper[4708]: I0320 16:03:59.900614 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8kspl_f49a68df-98d0-464f-b40e-0aba2faab528/kube-multus/1.log" Mar 20 16:03:59 crc kubenswrapper[4708]: I0320 16:03:59.901509 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8kspl_f49a68df-98d0-464f-b40e-0aba2faab528/kube-multus/0.log" Mar 20 16:03:59 crc kubenswrapper[4708]: I0320 16:03:59.901564 4708 generic.go:334] "Generic (PLEG): container finished" podID="f49a68df-98d0-464f-b40e-0aba2faab528" containerID="2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b" exitCode=1 Mar 20 16:03:59 crc kubenswrapper[4708]: I0320 16:03:59.901598 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8kspl" 
event={"ID":"f49a68df-98d0-464f-b40e-0aba2faab528","Type":"ContainerDied","Data":"2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b"} Mar 20 16:03:59 crc kubenswrapper[4708]: I0320 16:03:59.901639 4708 scope.go:117] "RemoveContainer" containerID="527b2cbf5ea2c1d0ef09e8cb69d86daf01c0c02defa9b720aac6760cbf187537" Mar 20 16:03:59 crc kubenswrapper[4708]: I0320 16:03:59.902489 4708 scope.go:117] "RemoveContainer" containerID="2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b" Mar 20 16:03:59 crc kubenswrapper[4708]: E0320 16:03:59.903106 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8kspl_openshift-multus(f49a68df-98d0-464f-b40e-0aba2faab528)\"" pod="openshift-multus/multus-8kspl" podUID="f49a68df-98d0-464f-b40e-0aba2faab528" Mar 20 16:04:00 crc kubenswrapper[4708]: I0320 16:04:00.111153 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:00 crc kubenswrapper[4708]: I0320 16:04:00.111197 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:00 crc kubenswrapper[4708]: E0320 16:04:00.111414 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:00 crc kubenswrapper[4708]: E0320 16:04:00.111575 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:04:00 crc kubenswrapper[4708]: I0320 16:04:00.907544 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8kspl_f49a68df-98d0-464f-b40e-0aba2faab528/kube-multus/1.log" Mar 20 16:04:01 crc kubenswrapper[4708]: I0320 16:04:01.110861 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:01 crc kubenswrapper[4708]: I0320 16:04:01.110983 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:01 crc kubenswrapper[4708]: E0320 16:04:01.111121 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:01 crc kubenswrapper[4708]: E0320 16:04:01.111363 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:01 crc kubenswrapper[4708]: E0320 16:04:01.264234 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:04:02 crc kubenswrapper[4708]: I0320 16:04:02.110750 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:02 crc kubenswrapper[4708]: I0320 16:04:02.110775 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:02 crc kubenswrapper[4708]: E0320 16:04:02.110987 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:04:02 crc kubenswrapper[4708]: E0320 16:04:02.111169 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:03 crc kubenswrapper[4708]: I0320 16:04:03.111201 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:03 crc kubenswrapper[4708]: E0320 16:04:03.111455 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:03 crc kubenswrapper[4708]: I0320 16:04:03.111201 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:03 crc kubenswrapper[4708]: E0320 16:04:03.111737 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:04 crc kubenswrapper[4708]: I0320 16:04:04.110738 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:04 crc kubenswrapper[4708]: I0320 16:04:04.110781 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:04 crc kubenswrapper[4708]: E0320 16:04:04.111186 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:04:04 crc kubenswrapper[4708]: E0320 16:04:04.111412 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:05 crc kubenswrapper[4708]: I0320 16:04:05.110373 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:05 crc kubenswrapper[4708]: I0320 16:04:05.110418 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:05 crc kubenswrapper[4708]: E0320 16:04:05.110522 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:05 crc kubenswrapper[4708]: E0320 16:04:05.110650 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:06 crc kubenswrapper[4708]: I0320 16:04:06.110217 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:06 crc kubenswrapper[4708]: I0320 16:04:06.110236 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:06 crc kubenswrapper[4708]: E0320 16:04:06.111745 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:04:06 crc kubenswrapper[4708]: E0320 16:04:06.111968 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:06 crc kubenswrapper[4708]: E0320 16:04:06.264845 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:04:07 crc kubenswrapper[4708]: I0320 16:04:07.110554 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:07 crc kubenswrapper[4708]: I0320 16:04:07.110554 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:07 crc kubenswrapper[4708]: E0320 16:04:07.111233 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:07 crc kubenswrapper[4708]: E0320 16:04:07.111307 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:08 crc kubenswrapper[4708]: I0320 16:04:08.110825 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:08 crc kubenswrapper[4708]: I0320 16:04:08.110851 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:08 crc kubenswrapper[4708]: E0320 16:04:08.111034 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:04:08 crc kubenswrapper[4708]: E0320 16:04:08.111110 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:08 crc kubenswrapper[4708]: I0320 16:04:08.113534 4708 scope.go:117] "RemoveContainer" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" Mar 20 16:04:08 crc kubenswrapper[4708]: I0320 16:04:08.938151 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/3.log" Mar 20 16:04:08 crc kubenswrapper[4708]: I0320 16:04:08.940935 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerStarted","Data":"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c"} Mar 20 16:04:08 crc kubenswrapper[4708]: I0320 16:04:08.995747 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podStartSLOduration=147.995727321 podStartE2EDuration="2m27.995727321s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:08.995167436 +0000 UTC m=+203.669504191" watchObservedRunningTime="2026-03-20 16:04:08.995727321 +0000 UTC m=+203.670064086" Mar 20 16:04:09 crc kubenswrapper[4708]: I0320 16:04:09.007726 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gtlzm"] Mar 20 16:04:09 crc kubenswrapper[4708]: I0320 16:04:09.007907 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:09 crc kubenswrapper[4708]: E0320 16:04:09.008053 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:04:09 crc kubenswrapper[4708]: I0320 16:04:09.111025 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:09 crc kubenswrapper[4708]: I0320 16:04:09.111066 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:09 crc kubenswrapper[4708]: E0320 16:04:09.112603 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:09 crc kubenswrapper[4708]: E0320 16:04:09.112706 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:10 crc kubenswrapper[4708]: I0320 16:04:10.110033 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:10 crc kubenswrapper[4708]: I0320 16:04:10.110033 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:10 crc kubenswrapper[4708]: E0320 16:04:10.111148 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:10 crc kubenswrapper[4708]: E0320 16:04:10.111219 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:04:11 crc kubenswrapper[4708]: I0320 16:04:11.110963 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:11 crc kubenswrapper[4708]: I0320 16:04:11.111011 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:11 crc kubenswrapper[4708]: E0320 16:04:11.112113 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:11 crc kubenswrapper[4708]: E0320 16:04:11.112291 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:11 crc kubenswrapper[4708]: E0320 16:04:11.266372 4708 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:04:12 crc kubenswrapper[4708]: I0320 16:04:12.110887 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:12 crc kubenswrapper[4708]: I0320 16:04:12.110887 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:12 crc kubenswrapper[4708]: E0320 16:04:12.111091 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:12 crc kubenswrapper[4708]: E0320 16:04:12.111228 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:04:12 crc kubenswrapper[4708]: I0320 16:04:12.112767 4708 scope.go:117] "RemoveContainer" containerID="2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b" Mar 20 16:04:12 crc kubenswrapper[4708]: I0320 16:04:12.960940 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8kspl_f49a68df-98d0-464f-b40e-0aba2faab528/kube-multus/1.log" Mar 20 16:04:12 crc kubenswrapper[4708]: I0320 16:04:12.961014 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8kspl" event={"ID":"f49a68df-98d0-464f-b40e-0aba2faab528","Type":"ContainerStarted","Data":"6194e56dac24e25230c92b6148f9a7bc07ff22fcbb4939993823336fbeddcc7a"} Mar 20 16:04:13 crc kubenswrapper[4708]: I0320 16:04:13.110387 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:13 crc kubenswrapper[4708]: I0320 16:04:13.110530 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:13 crc kubenswrapper[4708]: E0320 16:04:13.110636 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:13 crc kubenswrapper[4708]: E0320 16:04:13.110729 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:14 crc kubenswrapper[4708]: I0320 16:04:14.111111 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:14 crc kubenswrapper[4708]: I0320 16:04:14.111268 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.111365 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.111487 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:04:14 crc kubenswrapper[4708]: I0320 16:04:14.169076 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:14 crc kubenswrapper[4708]: I0320 16:04:14.169237 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.169330 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:06:16.169284005 +0000 UTC m=+330.843620760 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.169357 4708 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:04:14 crc kubenswrapper[4708]: I0320 16:04:14.169426 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.169492 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:06:16.16946711 +0000 UTC m=+330.843803865 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 16:04:14 crc kubenswrapper[4708]: I0320 16:04:14.169526 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:14 crc kubenswrapper[4708]: I0320 16:04:14.169591 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.169772 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.169800 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.169823 4708 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.169860 4708 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.169878 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.169933 4708 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.169947 4708 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.169889 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 16:06:16.169868993 +0000 UTC m=+330.844205748 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.170077 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 16:06:16.170015867 +0000 UTC m=+330.844352632 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 16:04:14 crc kubenswrapper[4708]: E0320 16:04:14.170132 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 16:06:16.170111509 +0000 UTC m=+330.844448264 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 16:04:15 crc kubenswrapper[4708]: I0320 16:04:15.110188 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:15 crc kubenswrapper[4708]: E0320 16:04:15.110349 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 16:04:15 crc kubenswrapper[4708]: I0320 16:04:15.110467 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:15 crc kubenswrapper[4708]: E0320 16:04:15.110711 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 16:04:16 crc kubenswrapper[4708]: I0320 16:04:16.110397 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:16 crc kubenswrapper[4708]: I0320 16:04:16.110396 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:16 crc kubenswrapper[4708]: E0320 16:04:16.111212 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gtlzm" podUID="3574461f-8c2b-446b-a2f1-c1be3a8d7824" Mar 20 16:04:16 crc kubenswrapper[4708]: E0320 16:04:16.111366 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.110426 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.110628 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.114206 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.114330 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.114541 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.114648 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.627011 4708 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.678131 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j74qm"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.678882 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.679509 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.680230 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.685144 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-r95lq"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.685836 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.687793 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-q9z2q"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.688474 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.690502 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.691930 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.695315 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rz6l8"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.696227 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rsr2h"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.696727 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.697288 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.697515 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.698446 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.701309 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.702152 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.702594 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.703331 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.703506 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.704002 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.704172 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.704333 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2bgzp"] Mar 20 
16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.704846 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.705193 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.704351 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.705707 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.705737 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2bgzp" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.705922 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.706191 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.706422 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.706598 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.704388 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.707052 4708 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.704422 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.705043 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.707444 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.705096 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.707603 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.705128 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.707755 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.705312 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.703359 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.708022 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.708257 4708 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.708453 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.711531 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.713942 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.716910 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.717451 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.719992 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-kdrms"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.729497 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.732487 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.732577 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.732781 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.732800 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.732883 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.733062 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.733238 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.733348 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.733419 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.733464 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.733605 4708 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.733775 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.733940 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.734094 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.734239 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.734364 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.734389 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.735282 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.735445 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.742046 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.743713 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.743821 4708 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.743902 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.743947 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744005 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744106 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744140 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744162 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744219 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744288 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.743947 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744305 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744316 
4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744431 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744478 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744544 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744551 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744646 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.744655 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.761152 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.766009 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.788225 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.789522 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 16:04:17 
crc kubenswrapper[4708]: I0320 16:04:17.790354 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.790785 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.791484 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpk69"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.791916 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.792067 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qh88t"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.792261 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.792341 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.792651 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.797107 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.802931 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-762wg"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.803648 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.803980 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.805117 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.806175 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.806412 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.806451 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.806533 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.806580 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.806782 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.806900 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.807029 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" 
Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.807131 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.807221 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.807395 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.820225 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821085 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48933855-7a14-47ec-a83d-1787cb444869-service-ca-bundle\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821121 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/716e1008-4ee5-42c3-9b4a-5c85a53489e0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821149 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48933855-7a14-47ec-a83d-1787cb444869-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821174 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a974d167-4fbf-4405-8acc-da1ab6e2f526-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6c49\" (UID: \"a974d167-4fbf-4405-8acc-da1ab6e2f526\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821199 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93dc952e-1f6a-4e6f-b3ce-3665b4129805-serving-cert\") pod \"openshift-config-operator-7777fb866f-gd2q5\" (UID: \"93dc952e-1f6a-4e6f-b3ce-3665b4129805\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821220 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rkc\" (UniqueName: \"kubernetes.io/projected/fe734cb0-fdf7-45a6-9a68-a12457600931-kube-api-access-t9rkc\") pod \"openshift-apiserver-operator-796bbdcf4f-ftsws\" (UID: \"fe734cb0-fdf7-45a6-9a68-a12457600931\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821241 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-config\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 
16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821266 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvw5r\" (UniqueName: \"kubernetes.io/projected/0619cccd-a201-4a73-9d07-01ccc4eb7c84-kube-api-access-vvw5r\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821298 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbp56\" (UniqueName: \"kubernetes.io/projected/0a5fa365-0110-4a06-b2d7-cfd9b5745603-kube-api-access-fbp56\") pod \"downloads-7954f5f757-2bgzp\" (UID: \"0a5fa365-0110-4a06-b2d7-cfd9b5745603\") " pod="openshift-console/downloads-7954f5f757-2bgzp" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821319 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzx75\" (UniqueName: \"kubernetes.io/projected/48933855-7a14-47ec-a83d-1787cb444869-kube-api-access-pzx75\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821340 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821359 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-encryption-config\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821380 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821400 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-client-ca\") pod \"route-controller-manager-6576b87f9c-r28l7\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821421 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/93dc952e-1f6a-4e6f-b3ce-3665b4129805-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gd2q5\" (UID: \"93dc952e-1f6a-4e6f-b3ce-3665b4129805\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821440 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48933855-7a14-47ec-a83d-1787cb444869-serving-cert\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821459 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-audit\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821479 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0619cccd-a201-4a73-9d07-01ccc4eb7c84-serving-cert\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821509 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2bnl\" (UniqueName: \"kubernetes.io/projected/40d77bdf-8222-4072-bd4b-b766e73992cc-kube-api-access-v2bnl\") pod \"route-controller-manager-6576b87f9c-r28l7\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821528 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-serving-cert\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821549 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-config\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821571 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-serving-cert\") pod \"console-operator-58897d9998-rsr2h\" (UID: \"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821595 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-audit-dir\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821616 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-trusted-ca\") pod \"console-operator-58897d9998-rsr2h\" (UID: \"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821693 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thkp7\" (UniqueName: \"kubernetes.io/projected/716e1008-4ee5-42c3-9b4a-5c85a53489e0-kube-api-access-thkp7\") pod \"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821717 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-config\") pod \"console-operator-58897d9998-rsr2h\" (UID: \"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821739 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe734cb0-fdf7-45a6-9a68-a12457600931-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ftsws\" (UID: \"fe734cb0-fdf7-45a6-9a68-a12457600931\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821766 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbvqc\" (UniqueName: \"kubernetes.io/projected/a974d167-4fbf-4405-8acc-da1ab6e2f526-kube-api-access-bbvqc\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6c49\" (UID: \"a974d167-4fbf-4405-8acc-da1ab6e2f526\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821792 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821815 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-oauth-serving-cert\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821836 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dzlh\" (UniqueName: \"kubernetes.io/projected/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-kube-api-access-6dzlh\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821862 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-config\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821884 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/50baa880-1d72-48c6-b370-2f0094a30f23-encryption-config\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821906 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-audit-policies\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821931 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nz86\" (UniqueName: \"kubernetes.io/projected/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-kube-api-access-6nz86\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821953 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50baa880-1d72-48c6-b370-2f0094a30f23-audit-dir\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821972 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe734cb0-fdf7-45a6-9a68-a12457600931-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ftsws\" (UID: \"fe734cb0-fdf7-45a6-9a68-a12457600931\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.821993 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-trusted-ca-bundle\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822016 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/716e1008-4ee5-42c3-9b4a-5c85a53489e0-config\") pod \"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822043 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50baa880-1d72-48c6-b370-2f0094a30f23-serving-cert\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822066 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a974d167-4fbf-4405-8acc-da1ab6e2f526-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6c49\" (UID: \"a974d167-4fbf-4405-8acc-da1ab6e2f526\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822087 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-etcd-serving-ca\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822108 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48933855-7a14-47ec-a83d-1787cb444869-config\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822132 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/40d77bdf-8222-4072-bd4b-b766e73992cc-serving-cert\") pod \"route-controller-manager-6576b87f9c-r28l7\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822153 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj27f\" (UniqueName: \"kubernetes.io/projected/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-kube-api-access-jj27f\") pod \"console-operator-58897d9998-rsr2h\" (UID: \"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822176 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-oauth-config\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822197 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50baa880-1d72-48c6-b370-2f0094a30f23-node-pullsecrets\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822219 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdj75\" (UniqueName: \"kubernetes.io/projected/93dc952e-1f6a-4e6f-b3ce-3665b4129805-kube-api-access-sdj75\") pod \"openshift-config-operator-7777fb866f-gd2q5\" (UID: \"93dc952e-1f6a-4e6f-b3ce-3665b4129805\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822261 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-client-ca\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822295 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-image-import-ca\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822317 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/50baa880-1d72-48c6-b370-2f0094a30f23-etcd-client\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822381 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-service-ca\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822532 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/716e1008-4ee5-42c3-9b4a-5c85a53489e0-images\") pod 
\"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822556 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq8ww\" (UniqueName: \"kubernetes.io/projected/50baa880-1d72-48c6-b370-2f0094a30f23-kube-api-access-pq8ww\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822590 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-config\") pod \"route-controller-manager-6576b87f9c-r28l7\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822612 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-etcd-client\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822636 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.822659 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-serving-cert\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.826234 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-r95lq"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.826295 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tkdfr"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.834000 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.835273 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.835558 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.835710 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.835847 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.835969 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.836093 4708 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.836293 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.836400 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.836813 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.836947 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.837070 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.837191 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.837307 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.837439 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.837555 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.852934 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 
16:04:17.853509 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8rllx"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.853823 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.854119 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.854467 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.854895 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.855177 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.855551 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.855944 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.856238 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.856254 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5mvhn"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.856552 4708 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.894042 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.894598 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.894852 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.896627 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.901736 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.902737 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.905243 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.907109 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.907377 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.908305 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.923083 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.923106 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.923383 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.924121 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.924605 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.924718 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.925208 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.925911 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567044-8vwl5"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.926191 4708 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.926401 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6h42j"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.926475 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.926719 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.926847 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.927084 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.927286 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.927685 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.928334 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qcjhm"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.928858 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-8vwl5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.928955 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/474a2884-6607-4971-9808-70ec4bc3796d-etcd-ca\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.928997 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48933855-7a14-47ec-a83d-1787cb444869-service-ca-bundle\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929019 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/716e1008-4ee5-42c3-9b4a-5c85a53489e0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929037 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48933855-7a14-47ec-a83d-1787cb444869-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929054 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pwbvp\" (UniqueName: \"kubernetes.io/projected/474a2884-6607-4971-9808-70ec4bc3796d-kube-api-access-pwbvp\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929074 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a974d167-4fbf-4405-8acc-da1ab6e2f526-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6c49\" (UID: \"a974d167-4fbf-4405-8acc-da1ab6e2f526\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929090 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93dc952e-1f6a-4e6f-b3ce-3665b4129805-serving-cert\") pod \"openshift-config-operator-7777fb866f-gd2q5\" (UID: \"93dc952e-1f6a-4e6f-b3ce-3665b4129805\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929108 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rkc\" (UniqueName: \"kubernetes.io/projected/fe734cb0-fdf7-45a6-9a68-a12457600931-kube-api-access-t9rkc\") pod \"openshift-apiserver-operator-796bbdcf4f-ftsws\" (UID: \"fe734cb0-fdf7-45a6-9a68-a12457600931\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929126 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-config\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929142 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvw5r\" (UniqueName: \"kubernetes.io/projected/0619cccd-a201-4a73-9d07-01ccc4eb7c84-kube-api-access-vvw5r\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929162 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a62ab634-c377-4878-808b-ebaaf2e87c8a-metrics-tls\") pod \"dns-operator-744455d44c-qh88t\" (UID: \"a62ab634-c377-4878-808b-ebaaf2e87c8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929180 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzx75\" (UniqueName: \"kubernetes.io/projected/48933855-7a14-47ec-a83d-1787cb444869-kube-api-access-pzx75\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929197 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/474a2884-6607-4971-9808-70ec4bc3796d-etcd-service-ca\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929222 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbp56\" (UniqueName: 
\"kubernetes.io/projected/0a5fa365-0110-4a06-b2d7-cfd9b5745603-kube-api-access-fbp56\") pod \"downloads-7954f5f757-2bgzp\" (UID: \"0a5fa365-0110-4a06-b2d7-cfd9b5745603\") " pod="openshift-console/downloads-7954f5f757-2bgzp" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929238 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929255 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929273 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-encryption-config\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929325 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-audit\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929341 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-client-ca\") pod \"route-controller-manager-6576b87f9c-r28l7\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929357 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/93dc952e-1f6a-4e6f-b3ce-3665b4129805-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gd2q5\" (UID: \"93dc952e-1f6a-4e6f-b3ce-3665b4129805\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929375 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48933855-7a14-47ec-a83d-1787cb444869-serving-cert\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929391 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0619cccd-a201-4a73-9d07-01ccc4eb7c84-serving-cert\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929418 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2bnl\" (UniqueName: \"kubernetes.io/projected/40d77bdf-8222-4072-bd4b-b766e73992cc-kube-api-access-v2bnl\") pod \"route-controller-manager-6576b87f9c-r28l7\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929436 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-serving-cert\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929455 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-config\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929472 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-audit-dir\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929489 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-serving-cert\") pod \"console-operator-58897d9998-rsr2h\" (UID: \"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929504 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-trusted-ca\") pod \"console-operator-58897d9998-rsr2h\" (UID: 
\"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929525 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thkp7\" (UniqueName: \"kubernetes.io/projected/716e1008-4ee5-42c3-9b4a-5c85a53489e0-kube-api-access-thkp7\") pod \"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929542 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-config\") pod \"console-operator-58897d9998-rsr2h\" (UID: \"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929555 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe734cb0-fdf7-45a6-9a68-a12457600931-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ftsws\" (UID: \"fe734cb0-fdf7-45a6-9a68-a12457600931\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929572 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbvqc\" (UniqueName: \"kubernetes.io/projected/a974d167-4fbf-4405-8acc-da1ab6e2f526-kube-api-access-bbvqc\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6c49\" (UID: \"a974d167-4fbf-4405-8acc-da1ab6e2f526\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929591 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-webhook-cert\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929609 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929636 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-oauth-serving-cert\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929686 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dzlh\" (UniqueName: \"kubernetes.io/projected/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-kube-api-access-6dzlh\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929702 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929914 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-audit-dir\") pod 
\"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929944 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.929995 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.930031 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.930264 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.930492 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.930181 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzgjh\" (UniqueName: \"kubernetes.io/projected/a62ab634-c377-4878-808b-ebaaf2e87c8a-kube-api-access-mzgjh\") pod \"dns-operator-744455d44c-qh88t\" (UID: \"a62ab634-c377-4878-808b-ebaaf2e87c8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.930976 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48933855-7a14-47ec-a83d-1787cb444869-service-ca-bundle\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.931219 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.932036 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.932053 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4hmnl"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.932288 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.932642 4708 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.932813 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.933143 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j74qm"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.933165 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w48dx"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.934299 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.936462 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-q9z2q"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.936496 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.936514 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rsr2h"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.936526 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.936538 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2bgzp"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.936612 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.936865 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.937891 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-oauth-serving-cert\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.938618 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a974d167-4fbf-4405-8acc-da1ab6e2f526-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6c49\" (UID: \"a974d167-4fbf-4405-8acc-da1ab6e2f526\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.938740 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-trusted-ca\") pod \"console-operator-58897d9998-rsr2h\" (UID: \"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.938863 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48933855-7a14-47ec-a83d-1787cb444869-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc 
kubenswrapper[4708]: I0320 16:04:17.939615 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-audit\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.939974 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-config\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.939979 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0619cccd-a201-4a73-9d07-01ccc4eb7c84-serving-cert\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.940003 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.940217 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.940319 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.940640 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-config\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.940715 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-config\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.940741 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/50baa880-1d72-48c6-b370-2f0094a30f23-encryption-config\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.940760 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-audit-policies\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.940975 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-config\") pod \"console-operator-58897d9998-rsr2h\" (UID: \"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941009 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lc27\" (UniqueName: \"kubernetes.io/projected/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-kube-api-access-6lc27\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941141 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941198 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nz86\" (UniqueName: \"kubernetes.io/projected/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-kube-api-access-6nz86\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941221 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/716e1008-4ee5-42c3-9b4a-5c85a53489e0-config\") pod \"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941275 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50baa880-1d72-48c6-b370-2f0094a30f23-audit-dir\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 
16:04:17.941292 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe734cb0-fdf7-45a6-9a68-a12457600931-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ftsws\" (UID: \"fe734cb0-fdf7-45a6-9a68-a12457600931\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941310 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-trusted-ca-bundle\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941332 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-etcd-serving-ca\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941347 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50baa880-1d72-48c6-b370-2f0094a30f23-serving-cert\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941367 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a974d167-4fbf-4405-8acc-da1ab6e2f526-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6c49\" (UID: \"a974d167-4fbf-4405-8acc-da1ab6e2f526\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941418 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-audit-policies\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941449 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48933855-7a14-47ec-a83d-1787cb444869-config\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941473 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-tmpfs\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941556 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50baa880-1d72-48c6-b370-2f0094a30f23-audit-dir\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.941972 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-config\") pod \"apiserver-76f77b778f-rz6l8\" (UID: 
\"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.942196 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe734cb0-fdf7-45a6-9a68-a12457600931-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ftsws\" (UID: \"fe734cb0-fdf7-45a6-9a68-a12457600931\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.942617 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-trusted-ca-bundle\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.942664 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48933855-7a14-47ec-a83d-1787cb444869-config\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.942742 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50baa880-1d72-48c6-b370-2f0094a30f23-node-pullsecrets\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.942762 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40d77bdf-8222-4072-bd4b-b766e73992cc-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-r28l7\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943039 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-etcd-serving-ca\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943252 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj27f\" (UniqueName: \"kubernetes.io/projected/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-kube-api-access-jj27f\") pod \"console-operator-58897d9998-rsr2h\" (UID: \"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943283 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-oauth-config\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943302 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-image-import-ca\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943311 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/93dc952e-1f6a-4e6f-b3ce-3665b4129805-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gd2q5\" (UID: \"93dc952e-1f6a-4e6f-b3ce-3665b4129805\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943319 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdj75\" (UniqueName: \"kubernetes.io/projected/93dc952e-1f6a-4e6f-b3ce-3665b4129805-kube-api-access-sdj75\") pod \"openshift-config-operator-7777fb866f-gd2q5\" (UID: \"93dc952e-1f6a-4e6f-b3ce-3665b4129805\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943340 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-client-ca\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943441 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943710 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50baa880-1d72-48c6-b370-2f0094a30f23-node-pullsecrets\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943767 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474a2884-6607-4971-9808-70ec4bc3796d-config\") pod 
\"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943818 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/716e1008-4ee5-42c3-9b4a-5c85a53489e0-images\") pod \"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943840 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/50baa880-1d72-48c6-b370-2f0094a30f23-etcd-client\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.943865 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-service-ca\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.944038 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/716e1008-4ee5-42c3-9b4a-5c85a53489e0-config\") pod \"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.944622 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-client-ca\") pod \"controller-manager-879f6c89f-j74qm\" (UID: 
\"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.945712 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-client-ca\") pod \"route-controller-manager-6576b87f9c-r28l7\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.946188 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93dc952e-1f6a-4e6f-b3ce-3665b4129805-serving-cert\") pod \"openshift-config-operator-7777fb866f-gd2q5\" (UID: \"93dc952e-1f6a-4e6f-b3ce-3665b4129805\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.946284 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe734cb0-fdf7-45a6-9a68-a12457600931-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ftsws\" (UID: \"fe734cb0-fdf7-45a6-9a68-a12457600931\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.946619 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-service-ca\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.946643 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/50baa880-1d72-48c6-b370-2f0094a30f23-image-import-ca\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.946754 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq8ww\" (UniqueName: \"kubernetes.io/projected/50baa880-1d72-48c6-b370-2f0094a30f23-kube-api-access-pq8ww\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.946860 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-apiservice-cert\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.946909 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kdrms"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.946954 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-config\") pod \"route-controller-manager-6576b87f9c-r28l7\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.947000 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/716e1008-4ee5-42c3-9b4a-5c85a53489e0-images\") pod \"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.947018 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-etcd-client\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.947165 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.947220 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/474a2884-6607-4971-9808-70ec4bc3796d-etcd-client\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.947363 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-serving-cert\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.947401 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/474a2884-6607-4971-9808-70ec4bc3796d-serving-cert\") pod \"etcd-operator-b45778765-762wg\" (UID: 
\"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.947701 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.948390 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50baa880-1d72-48c6-b370-2f0094a30f23-serving-cert\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.948481 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-config\") pod \"route-controller-manager-6576b87f9c-r28l7\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.948538 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40d77bdf-8222-4072-bd4b-b766e73992cc-serving-cert\") pod \"route-controller-manager-6576b87f9c-r28l7\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.949105 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-encryption-config\") pod 
\"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.949306 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-serving-cert\") pod \"console-operator-58897d9998-rsr2h\" (UID: \"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.949310 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-serving-cert\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.949397 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/716e1008-4ee5-42c3-9b4a-5c85a53489e0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.949864 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48933855-7a14-47ec-a83d-1787cb444869-serving-cert\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.949902 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-etcd-client\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.949979 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rz6l8"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.950548 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/50baa880-1d72-48c6-b370-2f0094a30f23-encryption-config\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.950715 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/50baa880-1d72-48c6-b370-2f0094a30f23-etcd-client\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.951293 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.952220 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a974d167-4fbf-4405-8acc-da1ab6e2f526-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6c49\" (UID: \"a974d167-4fbf-4405-8acc-da1ab6e2f526\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.952295 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.953261 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-762wg"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.954548 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpk69"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.955635 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-oauth-config\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.955867 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qcjhm"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.957087 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.958089 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5mvhn"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.958419 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-serving-cert\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.959137 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt"] Mar 20 16:04:17 crc 
kubenswrapper[4708]: I0320 16:04:17.964499 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.966031 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qh88t"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.979930 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.982459 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.982478 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.984227 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.985640 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.986517 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.987538 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4hmnl"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.990485 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w48dx"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.992319 4708 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tkdfr"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.997897 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p97v5"] Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.998808 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p97v5" Mar 20 16:04:17 crc kubenswrapper[4708]: I0320 16:04:17.999620 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.000740 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6h42j"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.002791 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.004476 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.005663 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.007007 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.008103 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-248bf"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.009499 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-248bf" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.010490 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p97v5"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.012082 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-8vwl5"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.013799 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.015071 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.016447 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc"] Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.029885 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.043038 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048061 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-webhook-cert\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048098 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mzgjh\" (UniqueName: \"kubernetes.io/projected/a62ab634-c377-4878-808b-ebaaf2e87c8a-kube-api-access-mzgjh\") pod \"dns-operator-744455d44c-qh88t\" (UID: \"a62ab634-c377-4878-808b-ebaaf2e87c8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048127 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lc27\" (UniqueName: \"kubernetes.io/projected/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-kube-api-access-6lc27\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048161 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-tmpfs\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048213 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474a2884-6607-4971-9808-70ec4bc3796d-config\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048254 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-apiservice-cert\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048282 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/474a2884-6607-4971-9808-70ec4bc3796d-etcd-client\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048303 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/474a2884-6607-4971-9808-70ec4bc3796d-serving-cert\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048329 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/474a2884-6607-4971-9808-70ec4bc3796d-etcd-ca\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048359 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbvp\" (UniqueName: \"kubernetes.io/projected/474a2884-6607-4971-9808-70ec4bc3796d-kube-api-access-pwbvp\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048407 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a62ab634-c377-4878-808b-ebaaf2e87c8a-metrics-tls\") pod \"dns-operator-744455d44c-qh88t\" (UID: \"a62ab634-c377-4878-808b-ebaaf2e87c8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048451 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/474a2884-6607-4971-9808-70ec4bc3796d-etcd-service-ca\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.048847 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-tmpfs\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.049264 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474a2884-6607-4971-9808-70ec4bc3796d-config\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.049420 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/474a2884-6607-4971-9808-70ec4bc3796d-etcd-service-ca\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.049594 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/474a2884-6607-4971-9808-70ec4bc3796d-etcd-ca\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.052611 4708 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/474a2884-6607-4971-9808-70ec4bc3796d-etcd-client\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.054642 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a62ab634-c377-4878-808b-ebaaf2e87c8a-metrics-tls\") pod \"dns-operator-744455d44c-qh88t\" (UID: \"a62ab634-c377-4878-808b-ebaaf2e87c8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.058253 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/474a2884-6607-4971-9808-70ec4bc3796d-serving-cert\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.063648 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.084624 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.103821 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.110098 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.110186 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.123165 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.143529 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.164712 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.181753 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.203172 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.243790 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.262176 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.283419 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.303233 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.322488 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.343166 
4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.363191 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.383224 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.401992 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.423830 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.442742 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.468044 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.482508 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.502784 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.522472 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.541928 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.563183 4708 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.581717 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.602166 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.622652 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.632355 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-webhook-cert\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.632767 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-apiservice-cert\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.642607 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.662548 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.682842 4708 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.702846 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.723429 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.742346 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.762397 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.782453 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.802577 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.823251 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.843929 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.862761 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 
16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.883478 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.904052 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.923018 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.941637 4708 request.go:700] Waited for 1.013576236s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.944330 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.962871 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 16:04:18 crc kubenswrapper[4708]: I0320 16:04:18.986281 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.003535 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.022753 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.043020 4708 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.063554 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.083582 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.102567 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.123331 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.142972 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.162609 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.182900 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.204655 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.244257 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2bnl\" (UniqueName: \"kubernetes.io/projected/40d77bdf-8222-4072-bd4b-b766e73992cc-kube-api-access-v2bnl\") pod \"route-controller-manager-6576b87f9c-r28l7\" (UID: 
\"40d77bdf-8222-4072-bd4b-b766e73992cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.245088 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.273825 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.282856 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.322371 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.336134 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbvqc\" (UniqueName: \"kubernetes.io/projected/a974d167-4fbf-4405-8acc-da1ab6e2f526-kube-api-access-bbvqc\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6c49\" (UID: \"a974d167-4fbf-4405-8acc-da1ab6e2f526\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.343472 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.363093 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.382713 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.403035 4708 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.421968 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.442988 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.462867 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.481973 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.503221 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.522368 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.524859 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.527381 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.543895 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.563571 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.583762 4708 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.603590 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.625845 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.643559 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.696051 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dzlh\" (UniqueName: \"kubernetes.io/projected/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-kube-api-access-6dzlh\") pod \"console-f9d7485db-kdrms\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.697281 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thkp7\" (UniqueName: \"kubernetes.io/projected/716e1008-4ee5-42c3-9b4a-5c85a53489e0-kube-api-access-thkp7\") pod \"machine-api-operator-5694c8668f-q9z2q\" (UID: \"716e1008-4ee5-42c3-9b4a-5c85a53489e0\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.720055 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.720568 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzx75\" (UniqueName: \"kubernetes.io/projected/48933855-7a14-47ec-a83d-1787cb444869-kube-api-access-pzx75\") pod \"authentication-operator-69f744f599-r95lq\" (UID: \"48933855-7a14-47ec-a83d-1787cb444869\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.722911 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.751432 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7"] Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.760937 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9rkc\" (UniqueName: \"kubernetes.io/projected/fe734cb0-fdf7-45a6-9a68-a12457600931-kube-api-access-t9rkc\") pod \"openshift-apiserver-operator-796bbdcf4f-ftsws\" (UID: \"fe734cb0-fdf7-45a6-9a68-a12457600931\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" Mar 20 16:04:19 crc kubenswrapper[4708]: W0320 16:04:19.762179 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40d77bdf_8222_4072_bd4b_b766e73992cc.slice/crio-b45798254d0a5a2f65a59a76d8589991ef97b8df32909d506d67468ea328edde WatchSource:0}: Error finding container b45798254d0a5a2f65a59a76d8589991ef97b8df32909d506d67468ea328edde: Status 404 
returned error can't find the container with id b45798254d0a5a2f65a59a76d8589991ef97b8df32909d506d67468ea328edde Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.778517 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49"] Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.780861 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvw5r\" (UniqueName: \"kubernetes.io/projected/0619cccd-a201-4a73-9d07-01ccc4eb7c84-kube-api-access-vvw5r\") pod \"controller-manager-879f6c89f-j74qm\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.798470 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nz86\" (UniqueName: \"kubernetes.io/projected/55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3-kube-api-access-6nz86\") pod \"apiserver-7bbb656c7d-pz6p4\" (UID: \"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.810024 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.821553 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj27f\" (UniqueName: \"kubernetes.io/projected/4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb-kube-api-access-jj27f\") pod \"console-operator-58897d9998-rsr2h\" (UID: \"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb\") " pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.830895 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.840526 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbp56\" (UniqueName: \"kubernetes.io/projected/0a5fa365-0110-4a06-b2d7-cfd9b5745603-kube-api-access-fbp56\") pod \"downloads-7954f5f757-2bgzp\" (UID: \"0a5fa365-0110-4a06-b2d7-cfd9b5745603\") " pod="openshift-console/downloads-7954f5f757-2bgzp" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.843474 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.859663 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdj75\" (UniqueName: \"kubernetes.io/projected/93dc952e-1f6a-4e6f-b3ce-3665b4129805-kube-api-access-sdj75\") pod \"openshift-config-operator-7777fb866f-gd2q5\" (UID: \"93dc952e-1f6a-4e6f-b3ce-3665b4129805\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.861003 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.879092 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq8ww\" (UniqueName: \"kubernetes.io/projected/50baa880-1d72-48c6-b370-2f0094a30f23-kube-api-access-pq8ww\") pod \"apiserver-76f77b778f-rz6l8\" (UID: \"50baa880-1d72-48c6-b370-2f0094a30f23\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.882845 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.904223 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.925890 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.927585 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-q9z2q"] Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.943177 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.960903 4708 request.go:700] Waited for 1.957054934s due to client-side throttling, not priority and fairness, request: PATCH:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-6576b87f9c-r28l7/status Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.982902 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 
16:04:19.996206 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" event={"ID":"716e1008-4ee5-42c3-9b4a-5c85a53489e0","Type":"ContainerStarted","Data":"113433f0ee63c203767fcd739f2932e26a7874df47f1083d7434922e75ad2a79"} Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.998501 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" event={"ID":"a974d167-4fbf-4405-8acc-da1ab6e2f526","Type":"ContainerStarted","Data":"e5d3e2cacb7686aefd7325b2aba9172b18e466561a52e5b22ffb12a533b56e97"} Mar 20 16:04:19 crc kubenswrapper[4708]: I0320 16:04:19.998529 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" event={"ID":"a974d167-4fbf-4405-8acc-da1ab6e2f526","Type":"ContainerStarted","Data":"c1591b4c690fb6b03a2bcabcb545164d8b6a81fd5af5eea78a7fcabb4392f5d6"} Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.003806 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" event={"ID":"40d77bdf-8222-4072-bd4b-b766e73992cc","Type":"ContainerStarted","Data":"48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0"} Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.003866 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" event={"ID":"40d77bdf-8222-4072-bd4b-b766e73992cc","Type":"ContainerStarted","Data":"b45798254d0a5a2f65a59a76d8589991ef97b8df32909d506d67468ea328edde"} Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.004445 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 
16:04:20.004806 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.008197 4708 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-r28l7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.008244 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" podUID="40d77bdf-8222-4072-bd4b-b766e73992cc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.023045 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.034265 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.064371 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lc27\" (UniqueName: \"kubernetes.io/projected/97a30f73-c3c7-4e96-b68b-9d92fea63eb8-kube-api-access-6lc27\") pod \"packageserver-d55dfcdfc-r9jtp\" (UID: \"97a30f73-c3c7-4e96-b68b-9d92fea63eb8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.084331 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.088363 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzgjh\" (UniqueName: \"kubernetes.io/projected/a62ab634-c377-4878-808b-ebaaf2e87c8a-kube-api-access-mzgjh\") pod \"dns-operator-744455d44c-qh88t\" (UID: \"a62ab634-c377-4878-808b-ebaaf2e87c8a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.091911 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.100243 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwbvp\" (UniqueName: \"kubernetes.io/projected/474a2884-6607-4971-9808-70ec4bc3796d-kube-api-access-pwbvp\") pod \"etcd-operator-b45778765-762wg\" (UID: \"474a2884-6607-4971-9808-70ec4bc3796d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.104921 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kdrms"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.107955 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.109801 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.115099 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-2bgzp" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.124298 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.153230 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.176758 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177254 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5cxc\" (UniqueName: \"kubernetes.io/projected/d4a86699-e0df-47a7-a7d6-50ad108ffaae-kube-api-access-k5cxc\") pod \"cluster-samples-operator-665b6dd947-mlg6n\" (UID: \"d4a86699-e0df-47a7-a7d6-50ad108ffaae\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177314 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5558439-83ed-4f0d-ae59-59b79ac23667-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8bbz9\" (UID: \"c5558439-83ed-4f0d-ae59-59b79ac23667\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177369 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbdadb48-e471-474c-8eff-d3acbd3b5ced-config\") pod \"kube-apiserver-operator-766d6c64bb-2h7nt\" (UID: \"bbdadb48-e471-474c-8eff-d3acbd3b5ced\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177390 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06c196bd-6e44-4148-8fd9-a0b9e13c09f9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vnqpk\" (UID: \"06c196bd-6e44-4148-8fd9-a0b9e13c09f9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177410 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/722ff406-045d-48b6-a329-df2851889a3a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k2mxn\" (UID: \"722ff406-045d-48b6-a329-df2851889a3a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177426 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/761e5144-aa8a-4203-b166-b5dc638bfe79-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177468 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722ff406-045d-48b6-a329-df2851889a3a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k2mxn\" (UID: \"722ff406-045d-48b6-a329-df2851889a3a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" Mar 20 16:04:20 crc 
kubenswrapper[4708]: I0320 16:04:20.177503 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdadb48-e471-474c-8eff-d3acbd3b5ced-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2h7nt\" (UID: \"bbdadb48-e471-474c-8eff-d3acbd3b5ced\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177525 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a22db040-c541-4ade-8099-899f3581d6c6-service-ca-bundle\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177608 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177648 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lnt5\" (UniqueName: \"kubernetes.io/projected/fa1343fa-debb-4834-a679-e82cec21dfda-kube-api-access-7lnt5\") pod \"machine-approver-56656f9798-46dcz\" (UID: \"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177691 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-dir\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177730 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177765 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a22db040-c541-4ade-8099-899f3581d6c6-metrics-certs\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177788 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177827 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f3f468de-b540-426b-8b2f-303230c91fd3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: \"f3f468de-b540-426b-8b2f-303230c91fd3\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177850 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fa1343fa-debb-4834-a679-e82cec21dfda-machine-approver-tls\") pod \"machine-approver-56656f9798-46dcz\" (UID: \"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177887 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177915 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722ff406-045d-48b6-a329-df2851889a3a-config\") pod \"kube-controller-manager-operator-78b949d7b-k2mxn\" (UID: \"722ff406-045d-48b6-a329-df2851889a3a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177935 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-trusted-ca\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177954 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-tls\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177977 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.177999 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178060 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-848jx\" (UniqueName: \"kubernetes.io/projected/60382231-11cf-4076-b6fa-2e6277ab675f-kube-api-access-848jx\") pod \"migrator-59844c95c7-7hgxt\" (UID: \"60382231-11cf-4076-b6fa-2e6277ab675f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178095 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjh6x\" (UniqueName: 
\"kubernetes.io/projected/f3f468de-b540-426b-8b2f-303230c91fd3-kube-api-access-cjh6x\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: \"f3f468de-b540-426b-8b2f-303230c91fd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178132 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkqpp\" (UniqueName: \"kubernetes.io/projected/c5558439-83ed-4f0d-ae59-59b79ac23667-kube-api-access-bkqpp\") pod \"olm-operator-6b444d44fb-8bbz9\" (UID: \"c5558439-83ed-4f0d-ae59-59b79ac23667\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178188 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/acda3608-876c-4788-8310-481241fc9fd5-signing-key\") pod \"service-ca-9c57cc56f-5mvhn\" (UID: \"acda3608-876c-4788-8310-481241fc9fd5\") " pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178265 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xvmh\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-kube-api-access-6xvmh\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178302 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178325 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178351 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3f468de-b540-426b-8b2f-303230c91fd3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: \"f3f468de-b540-426b-8b2f-303230c91fd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178373 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-certificates\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178425 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4a86699-e0df-47a7-a7d6-50ad108ffaae-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mlg6n\" (UID: \"d4a86699-e0df-47a7-a7d6-50ad108ffaae\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178450 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178491 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1343fa-debb-4834-a679-e82cec21dfda-config\") pod \"machine-approver-56656f9798-46dcz\" (UID: \"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178517 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jscz\" (UniqueName: \"kubernetes.io/projected/a22db040-c541-4ade-8099-899f3581d6c6-kube-api-access-4jscz\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178572 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqrlp\" (UniqueName: \"kubernetes.io/projected/acda3608-876c-4788-8310-481241fc9fd5-kube-api-access-cqrlp\") pod \"service-ca-9c57cc56f-5mvhn\" (UID: \"acda3608-876c-4788-8310-481241fc9fd5\") " pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178597 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dc14e84a-f5ca-4b5f-871e-059ea6092ad3-srv-cert\") pod \"catalog-operator-68c6474976-dgdfs\" (UID: 
\"dc14e84a-f5ca-4b5f-871e-059ea6092ad3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178618 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a22db040-c541-4ade-8099-899f3581d6c6-default-certificate\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.178789 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.179004 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/acda3608-876c-4788-8310-481241fc9fd5-signing-cabundle\") pod \"service-ca-9c57cc56f-5mvhn\" (UID: \"acda3608-876c-4788-8310-481241fc9fd5\") " pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.179036 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-bound-sa-token\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.179061 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cqrs\" (UniqueName: \"kubernetes.io/projected/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-kube-api-access-9cqrs\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.179098 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dc14e84a-f5ca-4b5f-871e-059ea6092ad3-profile-collector-cert\") pod \"catalog-operator-68c6474976-dgdfs\" (UID: \"dc14e84a-f5ca-4b5f-871e-059ea6092ad3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.179159 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-metrics-tls\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.187547 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06c196bd-6e44-4148-8fd9-a0b9e13c09f9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vnqpk\" (UID: \"06c196bd-6e44-4148-8fd9-a0b9e13c09f9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.187799 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.187924 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06c196bd-6e44-4148-8fd9-a0b9e13c09f9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vnqpk\" (UID: \"06c196bd-6e44-4148-8fd9-a0b9e13c09f9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.188425 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c5558439-83ed-4f0d-ae59-59b79ac23667-srv-cert\") pod \"olm-operator-6b444d44fb-8bbz9\" (UID: \"c5558439-83ed-4f0d-ae59-59b79ac23667\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.188524 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/761e5144-aa8a-4203-b166-b5dc638bfe79-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.188897 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.189012 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fa1343fa-debb-4834-a679-e82cec21dfda-auth-proxy-config\") pod \"machine-approver-56656f9798-46dcz\" (UID: \"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.189210 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-policies\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.189274 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbdadb48-e471-474c-8eff-d3acbd3b5ced-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2h7nt\" (UID: \"bbdadb48-e471-474c-8eff-d3acbd3b5ced\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.189430 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a22db040-c541-4ade-8099-899f3581d6c6-stats-auth\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.189522 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-trusted-ca\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.189588 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqmzq\" (UniqueName: \"kubernetes.io/projected/dc14e84a-f5ca-4b5f-871e-059ea6092ad3-kube-api-access-gqmzq\") pod \"catalog-operator-68c6474976-dgdfs\" (UID: \"dc14e84a-f5ca-4b5f-871e-059ea6092ad3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.189811 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkpnj\" (UniqueName: \"kubernetes.io/projected/68a687d9-a448-4a5c-b7b9-e4510468b3c9-kube-api-access-gkpnj\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.189857 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3f468de-b540-426b-8b2f-303230c91fd3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: \"f3f468de-b540-426b-8b2f-303230c91fd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.193509 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: 
\"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: E0320 16:04:20.194623 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:20.694603889 +0000 UTC m=+215.368940604 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.240878 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.294187 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.294946 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:20 crc kubenswrapper[4708]: E0320 16:04:20.295188 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:20.795154819 +0000 UTC m=+215.469491534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295229 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbjck\" (UniqueName: \"kubernetes.io/projected/c7bf1b0e-5792-415a-8c81-2dafe6019fac-kube-api-access-pbjck\") pod \"dns-default-qcjhm\" (UID: \"c7bf1b0e-5792-415a-8c81-2dafe6019fac\") " pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295292 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a22db040-c541-4ade-8099-899f3581d6c6-stats-auth\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295323 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/844b5b79-0eb3-4e21-8754-8b10a667f6d0-node-bootstrap-token\") pod \"machine-config-server-248bf\" (UID: \"844b5b79-0eb3-4e21-8754-8b10a667f6d0\") " pod="openshift-machine-config-operator/machine-config-server-248bf" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295347 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-trusted-ca\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295390 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmzq\" (UniqueName: \"kubernetes.io/projected/dc14e84a-f5ca-4b5f-871e-059ea6092ad3-kube-api-access-gqmzq\") pod \"catalog-operator-68c6474976-dgdfs\" (UID: \"dc14e84a-f5ca-4b5f-871e-059ea6092ad3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295426 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3e5293a-140f-4740-adba-04f550109f8e-proxy-tls\") pod \"machine-config-controller-84d6567774-mpv8d\" (UID: \"f3e5293a-140f-4740-adba-04f550109f8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295451 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3f468de-b540-426b-8b2f-303230c91fd3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: \"f3f468de-b540-426b-8b2f-303230c91fd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295494 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkpnj\" (UniqueName: \"kubernetes.io/projected/68a687d9-a448-4a5c-b7b9-e4510468b3c9-kube-api-access-gkpnj\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295519 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jx7z\" (UniqueName: \"kubernetes.io/projected/aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6-kube-api-access-7jx7z\") pod \"control-plane-machine-set-operator-78cbb6b69f-qd9xk\" (UID: \"aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295554 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295594 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5cxc\" (UniqueName: \"kubernetes.io/projected/d4a86699-e0df-47a7-a7d6-50ad108ffaae-kube-api-access-k5cxc\") pod \"cluster-samples-operator-665b6dd947-mlg6n\" (UID: \"d4a86699-e0df-47a7-a7d6-50ad108ffaae\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295622 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6183b8c1-817c-4778-8310-b0481ebcc004-config\") pod \"service-ca-operator-777779d784-sr2kb\" (UID: \"6183b8c1-817c-4778-8310-b0481ebcc004\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295649 4708 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5558439-83ed-4f0d-ae59-59b79ac23667-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8bbz9\" (UID: \"c5558439-83ed-4f0d-ae59-59b79ac23667\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295700 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sstrq\" (UniqueName: \"kubernetes.io/projected/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-kube-api-access-sstrq\") pod \"collect-profiles-29567040-v2h7s\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295743 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2zw7\" (UniqueName: \"kubernetes.io/projected/69e3d1a5-5541-46de-ae32-4f5005a7a6c6-kube-api-access-c2zw7\") pod \"multus-admission-controller-857f4d67dd-4hmnl\" (UID: \"69e3d1a5-5541-46de-ae32-4f5005a7a6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295769 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06c196bd-6e44-4148-8fd9-a0b9e13c09f9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vnqpk\" (UID: \"06c196bd-6e44-4148-8fd9-a0b9e13c09f9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295793 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/722ff406-045d-48b6-a329-df2851889a3a-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-k2mxn\" (UID: \"722ff406-045d-48b6-a329-df2851889a3a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295818 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbdadb48-e471-474c-8eff-d3acbd3b5ced-config\") pod \"kube-apiserver-operator-766d6c64bb-2h7nt\" (UID: \"bbdadb48-e471-474c-8eff-d3acbd3b5ced\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.295919 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/761e5144-aa8a-4203-b166-b5dc638bfe79-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: E0320 16:04:20.296325 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:20.796305242 +0000 UTC m=+215.470642117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.296886 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-plugins-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.296955 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722ff406-045d-48b6-a329-df2851889a3a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k2mxn\" (UID: \"722ff406-045d-48b6-a329-df2851889a3a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297032 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdadb48-e471-474c-8eff-d3acbd3b5ced-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2h7nt\" (UID: \"bbdadb48-e471-474c-8eff-d3acbd3b5ced\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297064 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a22db040-c541-4ade-8099-899f3581d6c6-service-ca-bundle\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297119 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc77181-1e16-4106-8433-da8e839d8275-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-696kc\" (UID: \"dbc77181-1e16-4106-8433-da8e839d8275\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297140 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f89383d-9d8a-4355-95d2-79081dda4a71-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-trjvs\" (UID: \"9f89383d-9d8a-4355-95d2-79081dda4a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297327 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-trusted-ca\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297516 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kb6s\" (UniqueName: \"kubernetes.io/projected/c8619d97-0550-44ab-b54a-9bb0c275d6d0-kube-api-access-4kb6s\") pod \"ingress-canary-p97v5\" (UID: \"c8619d97-0550-44ab-b54a-9bb0c275d6d0\") " 
pod="openshift-ingress-canary/ingress-canary-p97v5" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297563 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297593 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-socket-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297644 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lnt5\" (UniqueName: \"kubernetes.io/projected/fa1343fa-debb-4834-a679-e82cec21dfda-kube-api-access-7lnt5\") pod \"machine-approver-56656f9798-46dcz\" (UID: \"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297737 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-csi-data-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297771 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2c9b8a5-ad39-4793-b85e-6282872b25f6-proxy-tls\") pod 
\"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297854 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-dir\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.297888 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.298104 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbdadb48-e471-474c-8eff-d3acbd3b5ced-config\") pod \"kube-apiserver-operator-766d6c64bb-2h7nt\" (UID: \"bbdadb48-e471-474c-8eff-d3acbd3b5ced\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.298351 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-dir\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.298480 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a22db040-c541-4ade-8099-899f3581d6c6-metrics-certs\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.298529 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzhm8\" (UniqueName: \"kubernetes.io/projected/c2c9b8a5-ad39-4793-b85e-6282872b25f6-kube-api-access-mzhm8\") pod \"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.298558 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6h42j\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.298583 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8swt\" (UniqueName: \"kubernetes.io/projected/4437a79c-97de-40ff-a2fa-d29cc2f86828-kube-api-access-w8swt\") pod \"marketplace-operator-79b997595-6h42j\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.298643 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.298991 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a22db040-c541-4ade-8099-899f3581d6c6-service-ca-bundle\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.299978 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/69e3d1a5-5541-46de-ae32-4f5005a7a6c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4hmnl\" (UID: \"69e3d1a5-5541-46de-ae32-4f5005a7a6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.300009 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6h42j\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.300740 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f3f468de-b540-426b-8b2f-303230c91fd3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: \"f3f468de-b540-426b-8b2f-303230c91fd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.300785 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/c2c9b8a5-ad39-4793-b85e-6282872b25f6-images\") pod \"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.300883 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.300926 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722ff406-045d-48b6-a329-df2851889a3a-config\") pod \"kube-controller-manager-operator-78b949d7b-k2mxn\" (UID: \"722ff406-045d-48b6-a329-df2851889a3a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.300944 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-trusted-ca\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.300961 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fa1343fa-debb-4834-a679-e82cec21dfda-machine-approver-tls\") pod \"machine-approver-56656f9798-46dcz\" (UID: \"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc 
kubenswrapper[4708]: I0320 16:04:20.301018 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.307151 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/761e5144-aa8a-4203-b166-b5dc638bfe79-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.307302 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5558439-83ed-4f0d-ae59-59b79ac23667-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8bbz9\" (UID: \"c5558439-83ed-4f0d-ae59-59b79ac23667\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.307298 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-tls\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.307385 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.307410 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-848jx\" (UniqueName: \"kubernetes.io/projected/60382231-11cf-4076-b6fa-2e6277ab675f-kube-api-access-848jx\") pod \"migrator-59844c95c7-7hgxt\" (UID: \"60382231-11cf-4076-b6fa-2e6277ab675f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.308707 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722ff406-045d-48b6-a329-df2851889a3a-config\") pod \"kube-controller-manager-operator-78b949d7b-k2mxn\" (UID: \"722ff406-045d-48b6-a329-df2851889a3a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.309037 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.309288 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.309874 4708 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.310154 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.314195 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-trusted-ca\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.314398 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjh6x\" (UniqueName: \"kubernetes.io/projected/f3f468de-b540-426b-8b2f-303230c91fd3-kube-api-access-cjh6x\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: \"f3f468de-b540-426b-8b2f-303230c91fd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.314763 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hx92\" (UniqueName: \"kubernetes.io/projected/69fd6ec4-db77-4309-977a-7e80359aec50-kube-api-access-9hx92\") pod \"auto-csr-approver-29567044-8vwl5\" (UID: \"69fd6ec4-db77-4309-977a-7e80359aec50\") " 
pod="openshift-infra/auto-csr-approver-29567044-8vwl5" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.315286 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkqpp\" (UniqueName: \"kubernetes.io/projected/c5558439-83ed-4f0d-ae59-59b79ac23667-kube-api-access-bkqpp\") pod \"olm-operator-6b444d44fb-8bbz9\" (UID: \"c5558439-83ed-4f0d-ae59-59b79ac23667\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.316130 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-mountpoint-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.316284 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mbcc\" (UniqueName: \"kubernetes.io/projected/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-kube-api-access-5mbcc\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.316369 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8619d97-0550-44ab-b54a-9bb0c275d6d0-cert\") pod \"ingress-canary-p97v5\" (UID: \"c8619d97-0550-44ab-b54a-9bb0c275d6d0\") " pod="openshift-ingress-canary/ingress-canary-p97v5" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.316416 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89g5h\" (UniqueName: \"kubernetes.io/projected/6183b8c1-817c-4778-8310-b0481ebcc004-kube-api-access-89g5h\") 
pod \"service-ca-operator-777779d784-sr2kb\" (UID: \"6183b8c1-817c-4778-8310-b0481ebcc004\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.316494 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/844b5b79-0eb3-4e21-8754-8b10a667f6d0-certs\") pod \"machine-config-server-248bf\" (UID: \"844b5b79-0eb3-4e21-8754-8b10a667f6d0\") " pod="openshift-machine-config-operator/machine-config-server-248bf" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.316514 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7bf1b0e-5792-415a-8c81-2dafe6019fac-config-volume\") pod \"dns-default-qcjhm\" (UID: \"c7bf1b0e-5792-415a-8c81-2dafe6019fac\") " pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.316571 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/acda3608-876c-4788-8310-481241fc9fd5-signing-key\") pod \"service-ca-9c57cc56f-5mvhn\" (UID: \"acda3608-876c-4788-8310-481241fc9fd5\") " pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.316908 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xvmh\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-kube-api-access-6xvmh\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.317134 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.317560 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.317603 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3f468de-b540-426b-8b2f-303230c91fd3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: \"f3f468de-b540-426b-8b2f-303230c91fd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.318342 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-certificates\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.319101 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3f468de-b540-426b-8b2f-303230c91fd3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: \"f3f468de-b540-426b-8b2f-303230c91fd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 
16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.320165 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-certificates\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.320795 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a22db040-c541-4ade-8099-899f3581d6c6-metrics-certs\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.318436 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4a86699-e0df-47a7-a7d6-50ad108ffaae-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mlg6n\" (UID: \"d4a86699-e0df-47a7-a7d6-50ad108ffaae\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.324908 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.325304 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fa1343fa-debb-4834-a679-e82cec21dfda-machine-approver-tls\") pod \"machine-approver-56656f9798-46dcz\" (UID: 
\"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.325754 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.325864 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-tls\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.325944 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4a86699-e0df-47a7-a7d6-50ad108ffaae-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mlg6n\" (UID: \"d4a86699-e0df-47a7-a7d6-50ad108ffaae\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.326079 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722ff406-045d-48b6-a329-df2851889a3a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k2mxn\" (UID: \"722ff406-045d-48b6-a329-df2851889a3a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.326255 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/a22db040-c541-4ade-8099-899f3581d6c6-stats-auth\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.326303 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/acda3608-876c-4788-8310-481241fc9fd5-signing-key\") pod \"service-ca-9c57cc56f-5mvhn\" (UID: \"acda3608-876c-4788-8310-481241fc9fd5\") " pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.326967 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3f468de-b540-426b-8b2f-303230c91fd3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: \"f3f468de-b540-426b-8b2f-303230c91fd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.327502 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.327819 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c2c9b8a5-ad39-4793-b85e-6282872b25f6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 
crc kubenswrapper[4708]: I0320 16:04:20.327920 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.327945 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5h2b\" (UniqueName: \"kubernetes.io/projected/844b5b79-0eb3-4e21-8754-8b10a667f6d0-kube-api-access-w5h2b\") pod \"machine-config-server-248bf\" (UID: \"844b5b79-0eb3-4e21-8754-8b10a667f6d0\") " pod="openshift-machine-config-operator/machine-config-server-248bf" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.329215 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1343fa-debb-4834-a679-e82cec21dfda-config\") pod \"machine-approver-56656f9798-46dcz\" (UID: \"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.329166 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.329301 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jscz\" (UniqueName: \"kubernetes.io/projected/a22db040-c541-4ade-8099-899f3581d6c6-kube-api-access-4jscz\") pod 
\"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.330364 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1343fa-debb-4834-a679-e82cec21dfda-config\") pod \"machine-approver-56656f9798-46dcz\" (UID: \"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.329332 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3e5293a-140f-4740-adba-04f550109f8e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mpv8d\" (UID: \"f3e5293a-140f-4740-adba-04f550109f8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.337397 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqrlp\" (UniqueName: \"kubernetes.io/projected/acda3608-876c-4788-8310-481241fc9fd5-kube-api-access-cqrlp\") pod \"service-ca-9c57cc56f-5mvhn\" (UID: \"acda3608-876c-4788-8310-481241fc9fd5\") " pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.337531 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws"] Mar 20 16:04:20 crc kubenswrapper[4708]: W0320 16:04:20.341945 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b538ad_b2b1_4b7d_87e1_5ea5038ac7f3.slice/crio-441e82f1f67c60df82dbbb1e993aae8d4360d23a006dd8667c001972779f2f8f WatchSource:0}: Error finding container 
441e82f1f67c60df82dbbb1e993aae8d4360d23a006dd8667c001972779f2f8f: Status 404 returned error can't find the container with id 441e82f1f67c60df82dbbb1e993aae8d4360d23a006dd8667c001972779f2f8f Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.352170 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dc14e84a-f5ca-4b5f-871e-059ea6092ad3-srv-cert\") pod \"catalog-operator-68c6474976-dgdfs\" (UID: \"dc14e84a-f5ca-4b5f-871e-059ea6092ad3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.360837 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqmzq\" (UniqueName: \"kubernetes.io/projected/dc14e84a-f5ca-4b5f-871e-059ea6092ad3-kube-api-access-gqmzq\") pod \"catalog-operator-68c6474976-dgdfs\" (UID: \"dc14e84a-f5ca-4b5f-871e-059ea6092ad3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.365613 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dc14e84a-f5ca-4b5f-871e-059ea6092ad3-srv-cert\") pod \"catalog-operator-68c6474976-dgdfs\" (UID: \"dc14e84a-f5ca-4b5f-871e-059ea6092ad3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.367881 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j74qm"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.375958 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-r95lq"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.371210 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/06c196bd-6e44-4148-8fd9-a0b9e13c09f9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vnqpk\" (UID: \"06c196bd-6e44-4148-8fd9-a0b9e13c09f9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.376062 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a22db040-c541-4ade-8099-899f3581d6c6-default-certificate\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.376102 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4n82\" (UniqueName: \"kubernetes.io/projected/dbc77181-1e16-4106-8433-da8e839d8275-kube-api-access-b4n82\") pod \"package-server-manager-789f6589d5-696kc\" (UID: \"dbc77181-1e16-4106-8433-da8e839d8275\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.376203 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-secret-volume\") pod \"collect-profiles-29567040-v2h7s\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.376268 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.376293 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-registration-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.376320 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s87tn\" (UniqueName: \"kubernetes.io/projected/f3e5293a-140f-4740-adba-04f550109f8e-kube-api-access-s87tn\") pod \"machine-config-controller-84d6567774-mpv8d\" (UID: \"f3e5293a-140f-4740-adba-04f550109f8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.376371 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/acda3608-876c-4788-8310-481241fc9fd5-signing-cabundle\") pod \"service-ca-9c57cc56f-5mvhn\" (UID: \"acda3608-876c-4788-8310-481241fc9fd5\") " pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.376394 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f89383d-9d8a-4355-95d2-79081dda4a71-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-trjvs\" (UID: \"9f89383d-9d8a-4355-95d2-79081dda4a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.376424 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-config-volume\") pod \"collect-profiles-29567040-v2h7s\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.378157 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/acda3608-876c-4788-8310-481241fc9fd5-signing-cabundle\") pod \"service-ca-9c57cc56f-5mvhn\" (UID: \"acda3608-876c-4788-8310-481241fc9fd5\") " pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.384902 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.386106 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/722ff406-045d-48b6-a329-df2851889a3a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k2mxn\" (UID: \"722ff406-045d-48b6-a329-df2851889a3a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.376455 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-bound-sa-token\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc 
kubenswrapper[4708]: I0320 16:04:20.387435 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cqrs\" (UniqueName: \"kubernetes.io/projected/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-kube-api-access-9cqrs\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.387487 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbnk5\" (UniqueName: \"kubernetes.io/projected/9f89383d-9d8a-4355-95d2-79081dda4a71-kube-api-access-hbnk5\") pod \"kube-storage-version-migrator-operator-b67b599dd-trjvs\" (UID: \"9f89383d-9d8a-4355-95d2-79081dda4a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.390204 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dc14e84a-f5ca-4b5f-871e-059ea6092ad3-profile-collector-cert\") pod \"catalog-operator-68c6474976-dgdfs\" (UID: \"dc14e84a-f5ca-4b5f-871e-059ea6092ad3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.390265 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7bf1b0e-5792-415a-8c81-2dafe6019fac-metrics-tls\") pod \"dns-default-qcjhm\" (UID: \"c7bf1b0e-5792-415a-8c81-2dafe6019fac\") " pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.390311 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06c196bd-6e44-4148-8fd9-a0b9e13c09f9-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-vnqpk\" (UID: \"06c196bd-6e44-4148-8fd9-a0b9e13c09f9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.390363 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-metrics-tls\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.390396 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.390465 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06c196bd-6e44-4148-8fd9-a0b9e13c09f9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vnqpk\" (UID: \"06c196bd-6e44-4148-8fd9-a0b9e13c09f9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.390511 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c5558439-83ed-4f0d-ae59-59b79ac23667-srv-cert\") pod \"olm-operator-6b444d44fb-8bbz9\" (UID: \"c5558439-83ed-4f0d-ae59-59b79ac23667\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.390538 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/761e5144-aa8a-4203-b166-b5dc638bfe79-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.390612 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.390649 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fa1343fa-debb-4834-a679-e82cec21dfda-auth-proxy-config\") pod \"machine-approver-56656f9798-46dcz\" (UID: \"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.390816 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qd9xk\" (UID: \"aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.391183 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6183b8c1-817c-4778-8310-b0481ebcc004-serving-cert\") pod \"service-ca-operator-777779d784-sr2kb\" (UID: \"6183b8c1-817c-4778-8310-b0481ebcc004\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.391264 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-policies\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.391298 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbdadb48-e471-474c-8eff-d3acbd3b5ced-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2h7nt\" (UID: \"bbdadb48-e471-474c-8eff-d3acbd3b5ced\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.393918 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/761e5144-aa8a-4203-b166-b5dc638bfe79-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.394568 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06c196bd-6e44-4148-8fd9-a0b9e13c09f9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vnqpk\" (UID: \"06c196bd-6e44-4148-8fd9-a0b9e13c09f9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.395382 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-policies\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.398269 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a22db040-c541-4ade-8099-899f3581d6c6-default-certificate\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.399106 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rz6l8"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.399945 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fa1343fa-debb-4834-a679-e82cec21dfda-auth-proxy-config\") pod \"machine-approver-56656f9798-46dcz\" (UID: \"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.406738 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rsr2h"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.409068 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06c196bd-6e44-4148-8fd9-a0b9e13c09f9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vnqpk\" (UID: \"06c196bd-6e44-4148-8fd9-a0b9e13c09f9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" Mar 20 16:04:20 crc kubenswrapper[4708]: 
I0320 16:04:20.410905 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dc14e84a-f5ca-4b5f-871e-059ea6092ad3-profile-collector-cert\") pod \"catalog-operator-68c6474976-dgdfs\" (UID: \"dc14e84a-f5ca-4b5f-871e-059ea6092ad3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.417771 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-metrics-tls\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.418187 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbdadb48-e471-474c-8eff-d3acbd3b5ced-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2h7nt\" (UID: \"bbdadb48-e471-474c-8eff-d3acbd3b5ced\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.420947 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c5558439-83ed-4f0d-ae59-59b79ac23667-srv-cert\") pod \"olm-operator-6b444d44fb-8bbz9\" (UID: \"c5558439-83ed-4f0d-ae59-59b79ac23667\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.421428 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.421766 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.422149 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkpnj\" (UniqueName: \"kubernetes.io/projected/68a687d9-a448-4a5c-b7b9-e4510468b3c9-kube-api-access-gkpnj\") pod \"oauth-openshift-558db77b4-hpk69\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.425088 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5cxc\" (UniqueName: \"kubernetes.io/projected/d4a86699-e0df-47a7-a7d6-50ad108ffaae-kube-api-access-k5cxc\") pod \"cluster-samples-operator-665b6dd947-mlg6n\" (UID: \"d4a86699-e0df-47a7-a7d6-50ad108ffaae\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.440135 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.448596 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdadb48-e471-474c-8eff-d3acbd3b5ced-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2h7nt\" (UID: \"bbdadb48-e471-474c-8eff-d3acbd3b5ced\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.459385 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lnt5\" (UniqueName: \"kubernetes.io/projected/fa1343fa-debb-4834-a679-e82cec21dfda-kube-api-access-7lnt5\") pod \"machine-approver-56656f9798-46dcz\" (UID: \"fa1343fa-debb-4834-a679-e82cec21dfda\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.481498 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.492984 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493178 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3e5293a-140f-4740-adba-04f550109f8e-proxy-tls\") pod \"machine-config-controller-84d6567774-mpv8d\" (UID: \"f3e5293a-140f-4740-adba-04f550109f8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493221 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jx7z\" (UniqueName: \"kubernetes.io/projected/aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6-kube-api-access-7jx7z\") pod \"control-plane-machine-set-operator-78cbb6b69f-qd9xk\" (UID: \"aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493262 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6183b8c1-817c-4778-8310-b0481ebcc004-config\") pod \"service-ca-operator-777779d784-sr2kb\" (UID: \"6183b8c1-817c-4778-8310-b0481ebcc004\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493300 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sstrq\" (UniqueName: 
\"kubernetes.io/projected/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-kube-api-access-sstrq\") pod \"collect-profiles-29567040-v2h7s\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493326 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2zw7\" (UniqueName: \"kubernetes.io/projected/69e3d1a5-5541-46de-ae32-4f5005a7a6c6-kube-api-access-c2zw7\") pod \"multus-admission-controller-857f4d67dd-4hmnl\" (UID: \"69e3d1a5-5541-46de-ae32-4f5005a7a6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493350 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-plugins-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493386 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc77181-1e16-4106-8433-da8e839d8275-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-696kc\" (UID: \"dbc77181-1e16-4106-8433-da8e839d8275\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493412 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f89383d-9d8a-4355-95d2-79081dda4a71-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-trjvs\" (UID: \"9f89383d-9d8a-4355-95d2-79081dda4a71\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493430 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kb6s\" (UniqueName: \"kubernetes.io/projected/c8619d97-0550-44ab-b54a-9bb0c275d6d0-kube-api-access-4kb6s\") pod \"ingress-canary-p97v5\" (UID: \"c8619d97-0550-44ab-b54a-9bb0c275d6d0\") " pod="openshift-ingress-canary/ingress-canary-p97v5" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493446 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-socket-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493462 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-csi-data-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493479 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2c9b8a5-ad39-4793-b85e-6282872b25f6-proxy-tls\") pod \"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493504 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzhm8\" (UniqueName: \"kubernetes.io/projected/c2c9b8a5-ad39-4793-b85e-6282872b25f6-kube-api-access-mzhm8\") pod 
\"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493521 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6h42j\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493537 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8swt\" (UniqueName: \"kubernetes.io/projected/4437a79c-97de-40ff-a2fa-d29cc2f86828-kube-api-access-w8swt\") pod \"marketplace-operator-79b997595-6h42j\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493555 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/69e3d1a5-5541-46de-ae32-4f5005a7a6c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4hmnl\" (UID: \"69e3d1a5-5541-46de-ae32-4f5005a7a6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493570 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6h42j\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493597 4708 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c2c9b8a5-ad39-4793-b85e-6282872b25f6-images\") pod \"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493639 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hx92\" (UniqueName: \"kubernetes.io/projected/69fd6ec4-db77-4309-977a-7e80359aec50-kube-api-access-9hx92\") pod \"auto-csr-approver-29567044-8vwl5\" (UID: \"69fd6ec4-db77-4309-977a-7e80359aec50\") " pod="openshift-infra/auto-csr-approver-29567044-8vwl5" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493688 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-mountpoint-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493714 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mbcc\" (UniqueName: \"kubernetes.io/projected/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-kube-api-access-5mbcc\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493740 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8619d97-0550-44ab-b54a-9bb0c275d6d0-cert\") pod \"ingress-canary-p97v5\" (UID: \"c8619d97-0550-44ab-b54a-9bb0c275d6d0\") " pod="openshift-ingress-canary/ingress-canary-p97v5" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493760 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89g5h\" (UniqueName: \"kubernetes.io/projected/6183b8c1-817c-4778-8310-b0481ebcc004-kube-api-access-89g5h\") pod \"service-ca-operator-777779d784-sr2kb\" (UID: \"6183b8c1-817c-4778-8310-b0481ebcc004\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493782 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/844b5b79-0eb3-4e21-8754-8b10a667f6d0-certs\") pod \"machine-config-server-248bf\" (UID: \"844b5b79-0eb3-4e21-8754-8b10a667f6d0\") " pod="openshift-machine-config-operator/machine-config-server-248bf" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493802 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7bf1b0e-5792-415a-8c81-2dafe6019fac-config-volume\") pod \"dns-default-qcjhm\" (UID: \"c7bf1b0e-5792-415a-8c81-2dafe6019fac\") " pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493831 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c2c9b8a5-ad39-4793-b85e-6282872b25f6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493849 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5h2b\" (UniqueName: \"kubernetes.io/projected/844b5b79-0eb3-4e21-8754-8b10a667f6d0-kube-api-access-w5h2b\") pod \"machine-config-server-248bf\" (UID: \"844b5b79-0eb3-4e21-8754-8b10a667f6d0\") " pod="openshift-machine-config-operator/machine-config-server-248bf" 
Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493874 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3e5293a-140f-4740-adba-04f550109f8e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mpv8d\" (UID: \"f3e5293a-140f-4740-adba-04f550109f8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493908 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4n82\" (UniqueName: \"kubernetes.io/projected/dbc77181-1e16-4106-8433-da8e839d8275-kube-api-access-b4n82\") pod \"package-server-manager-789f6589d5-696kc\" (UID: \"dbc77181-1e16-4106-8433-da8e839d8275\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493926 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-secret-volume\") pod \"collect-profiles-29567040-v2h7s\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493955 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-registration-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493970 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s87tn\" (UniqueName: \"kubernetes.io/projected/f3e5293a-140f-4740-adba-04f550109f8e-kube-api-access-s87tn\") pod 
\"machine-config-controller-84d6567774-mpv8d\" (UID: \"f3e5293a-140f-4740-adba-04f550109f8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.493988 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f89383d-9d8a-4355-95d2-79081dda4a71-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-trjvs\" (UID: \"9f89383d-9d8a-4355-95d2-79081dda4a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.494003 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-config-volume\") pod \"collect-profiles-29567040-v2h7s\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.494031 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbnk5\" (UniqueName: \"kubernetes.io/projected/9f89383d-9d8a-4355-95d2-79081dda4a71-kube-api-access-hbnk5\") pod \"kube-storage-version-migrator-operator-b67b599dd-trjvs\" (UID: \"9f89383d-9d8a-4355-95d2-79081dda4a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.494047 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7bf1b0e-5792-415a-8c81-2dafe6019fac-metrics-tls\") pod \"dns-default-qcjhm\" (UID: \"c7bf1b0e-5792-415a-8c81-2dafe6019fac\") " pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.494067 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qd9xk\" (UID: \"aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.494092 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6183b8c1-817c-4778-8310-b0481ebcc004-serving-cert\") pod \"service-ca-operator-777779d784-sr2kb\" (UID: \"6183b8c1-817c-4778-8310-b0481ebcc004\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.494110 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/844b5b79-0eb3-4e21-8754-8b10a667f6d0-node-bootstrap-token\") pod \"machine-config-server-248bf\" (UID: \"844b5b79-0eb3-4e21-8754-8b10a667f6d0\") " pod="openshift-machine-config-operator/machine-config-server-248bf" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.494125 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbjck\" (UniqueName: \"kubernetes.io/projected/c7bf1b0e-5792-415a-8c81-2dafe6019fac-kube-api-access-pbjck\") pod \"dns-default-qcjhm\" (UID: \"c7bf1b0e-5792-415a-8c81-2dafe6019fac\") " pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:20 crc kubenswrapper[4708]: E0320 16:04:20.494359 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:20.994343967 +0000 UTC m=+215.668680682 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.500258 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.500613 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-mountpoint-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.501034 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-csi-data-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.501455 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-plugins-dir\") pod 
\"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.501516 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6183b8c1-817c-4778-8310-b0481ebcc004-config\") pod \"service-ca-operator-777779d784-sr2kb\" (UID: \"6183b8c1-817c-4778-8310-b0481ebcc004\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.502478 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c2c9b8a5-ad39-4793-b85e-6282872b25f6-images\") pod \"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.502831 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3e5293a-140f-4740-adba-04f550109f8e-proxy-tls\") pod \"machine-config-controller-84d6567774-mpv8d\" (UID: \"f3e5293a-140f-4740-adba-04f550109f8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.503302 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-registration-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.506405 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7bf1b0e-5792-415a-8c81-2dafe6019fac-config-volume\") 
pod \"dns-default-qcjhm\" (UID: \"c7bf1b0e-5792-415a-8c81-2dafe6019fac\") " pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.507447 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc77181-1e16-4106-8433-da8e839d8275-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-696kc\" (UID: \"dbc77181-1e16-4106-8433-da8e839d8275\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.508531 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2c9b8a5-ad39-4793-b85e-6282872b25f6-proxy-tls\") pod \"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.508619 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-secret-volume\") pod \"collect-profiles-29567040-v2h7s\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.510368 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6h42j\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.511143 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/69e3d1a5-5541-46de-ae32-4f5005a7a6c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4hmnl\" (UID: \"69e3d1a5-5541-46de-ae32-4f5005a7a6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.511533 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8619d97-0550-44ab-b54a-9bb0c275d6d0-cert\") pod \"ingress-canary-p97v5\" (UID: \"c8619d97-0550-44ab-b54a-9bb0c275d6d0\") " pod="openshift-ingress-canary/ingress-canary-p97v5" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.511921 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3e5293a-140f-4740-adba-04f550109f8e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mpv8d\" (UID: \"f3e5293a-140f-4740-adba-04f550109f8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.512071 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/844b5b79-0eb3-4e21-8754-8b10a667f6d0-certs\") pod \"machine-config-server-248bf\" (UID: \"844b5b79-0eb3-4e21-8754-8b10a667f6d0\") " pod="openshift-machine-config-operator/machine-config-server-248bf" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.512264 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qd9xk\" (UID: \"aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.512341 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-socket-dir\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.512654 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-config-volume\") pod \"collect-profiles-29567040-v2h7s\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.515461 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6183b8c1-817c-4778-8310-b0481ebcc004-serving-cert\") pod \"service-ca-operator-777779d784-sr2kb\" (UID: \"6183b8c1-817c-4778-8310-b0481ebcc004\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.515477 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f89383d-9d8a-4355-95d2-79081dda4a71-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-trjvs\" (UID: \"9f89383d-9d8a-4355-95d2-79081dda4a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.516503 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6h42j\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.516822 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c2c9b8a5-ad39-4793-b85e-6282872b25f6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.519067 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/844b5b79-0eb3-4e21-8754-8b10a667f6d0-node-bootstrap-token\") pod \"machine-config-server-248bf\" (UID: \"844b5b79-0eb3-4e21-8754-8b10a667f6d0\") " pod="openshift-machine-config-operator/machine-config-server-248bf" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.519598 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7bf1b0e-5792-415a-8c81-2dafe6019fac-metrics-tls\") pod \"dns-default-qcjhm\" (UID: \"c7bf1b0e-5792-415a-8c81-2dafe6019fac\") " pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.522052 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-848jx\" (UniqueName: \"kubernetes.io/projected/60382231-11cf-4076-b6fa-2e6277ab675f-kube-api-access-848jx\") pod \"migrator-59844c95c7-7hgxt\" (UID: \"60382231-11cf-4076-b6fa-2e6277ab675f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.530412 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f3f468de-b540-426b-8b2f-303230c91fd3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: 
\"f3f468de-b540-426b-8b2f-303230c91fd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.530726 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f89383d-9d8a-4355-95d2-79081dda4a71-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-trjvs\" (UID: \"9f89383d-9d8a-4355-95d2-79081dda4a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.542795 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjh6x\" (UniqueName: \"kubernetes.io/projected/f3f468de-b540-426b-8b2f-303230c91fd3-kube-api-access-cjh6x\") pod \"cluster-image-registry-operator-dc59b4c8b-hrsxq\" (UID: \"f3f468de-b540-426b-8b2f-303230c91fd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.543310 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.549625 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.558541 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.574418 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.574735 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xvmh\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-kube-api-access-6xvmh\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.585257 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.586494 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkqpp\" (UniqueName: \"kubernetes.io/projected/c5558439-83ed-4f0d-ae59-59b79ac23667-kube-api-access-bkqpp\") pod \"olm-operator-6b444d44fb-8bbz9\" (UID: \"c5558439-83ed-4f0d-ae59-59b79ac23667\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.596080 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: E0320 16:04:20.597315 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:21.097300776 +0000 UTC m=+215.771637491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.604311 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.625347 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2bgzp"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.629900 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jscz\" (UniqueName: \"kubernetes.io/projected/a22db040-c541-4ade-8099-899f3581d6c6-kube-api-access-4jscz\") pod \"router-default-5444994796-8rllx\" (UID: \"a22db040-c541-4ade-8099-899f3581d6c6\") " pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.643262 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.656929 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqrlp\" (UniqueName: \"kubernetes.io/projected/acda3608-876c-4788-8310-481241fc9fd5-kube-api-access-cqrlp\") pod \"service-ca-9c57cc56f-5mvhn\" (UID: \"acda3608-876c-4788-8310-481241fc9fd5\") " pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.672175 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-hpk69"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.672432 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-bound-sa-token\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.689593 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cqrs\" (UniqueName: \"kubernetes.io/projected/2f0d60b9-4e7c-4ee8-a7f9-93f217c81603-kube-api-access-9cqrs\") pod \"ingress-operator-5b745b69d9-xfcd6\" (UID: \"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.699096 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:20 crc kubenswrapper[4708]: E0320 16:04:20.699318 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:21.199290357 +0000 UTC m=+215.873627072 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.699578 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: E0320 16:04:20.700009 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:21.199992528 +0000 UTC m=+215.874329243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.720995 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbjck\" (UniqueName: \"kubernetes.io/projected/c7bf1b0e-5792-415a-8c81-2dafe6019fac-kube-api-access-pbjck\") pod \"dns-default-qcjhm\" (UID: \"c7bf1b0e-5792-415a-8c81-2dafe6019fac\") " pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.730938 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jx7z\" (UniqueName: \"kubernetes.io/projected/aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6-kube-api-access-7jx7z\") pod \"control-plane-machine-set-operator-78cbb6b69f-qd9xk\" (UID: \"aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.747788 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.755068 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sstrq\" (UniqueName: \"kubernetes.io/projected/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-kube-api-access-sstrq\") pod \"collect-profiles-29567040-v2h7s\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.779444 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2zw7\" (UniqueName: \"kubernetes.io/projected/69e3d1a5-5541-46de-ae32-4f5005a7a6c6-kube-api-access-c2zw7\") pod \"multus-admission-controller-857f4d67dd-4hmnl\" (UID: \"69e3d1a5-5541-46de-ae32-4f5005a7a6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.790487 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-762wg"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.791114 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.798196 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.799480 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8swt\" (UniqueName: \"kubernetes.io/projected/4437a79c-97de-40ff-a2fa-d29cc2f86828-kube-api-access-w8swt\") pod \"marketplace-operator-79b997595-6h42j\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.802469 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:20 crc kubenswrapper[4708]: E0320 16:04:20.803134 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:21.30309127 +0000 UTC m=+215.977427985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.803328 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s87tn\" (UniqueName: \"kubernetes.io/projected/f3e5293a-140f-4740-adba-04f550109f8e-kube-api-access-s87tn\") pod \"machine-config-controller-84d6567774-mpv8d\" (UID: \"f3e5293a-140f-4740-adba-04f550109f8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.805558 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.815653 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qh88t"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.819046 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.830306 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzhm8\" (UniqueName: \"kubernetes.io/projected/c2c9b8a5-ad39-4793-b85e-6282872b25f6-kube-api-access-mzhm8\") pod \"machine-config-operator-74547568cd-jfntr\" (UID: \"c2c9b8a5-ad39-4793-b85e-6282872b25f6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.831108 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.855050 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89g5h\" (UniqueName: \"kubernetes.io/projected/6183b8c1-817c-4778-8310-b0481ebcc004-kube-api-access-89g5h\") pod \"service-ca-operator-777779d784-sr2kb\" (UID: \"6183b8c1-817c-4778-8310-b0481ebcc004\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" Mar 20 16:04:20 crc kubenswrapper[4708]: W0320 16:04:20.869261 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda62ab634_c377_4878_808b_ebaaf2e87c8a.slice/crio-dfb4e696e9bafd9a96500e271bc6534eb951804ecdd3a61f9498fe1373096d8f WatchSource:0}: Error finding container dfb4e696e9bafd9a96500e271bc6534eb951804ecdd3a61f9498fe1373096d8f: Status 404 returned error can't find the container with id dfb4e696e9bafd9a96500e271bc6534eb951804ecdd3a61f9498fe1373096d8f Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.875714 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hx92\" (UniqueName: \"kubernetes.io/projected/69fd6ec4-db77-4309-977a-7e80359aec50-kube-api-access-9hx92\") pod 
\"auto-csr-approver-29567044-8vwl5\" (UID: \"69fd6ec4-db77-4309-977a-7e80359aec50\") " pod="openshift-infra/auto-csr-approver-29567044-8vwl5" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.892472 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mbcc\" (UniqueName: \"kubernetes.io/projected/05d36f6d-3186-48d6-8388-b4c1b6e02bd2-kube-api-access-5mbcc\") pod \"csi-hostpathplugin-w48dx\" (UID: \"05d36f6d-3186-48d6-8388-b4c1b6e02bd2\") " pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.893982 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.902870 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.904979 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:20 crc kubenswrapper[4708]: E0320 16:04:20.905357 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:21.405344849 +0000 UTC m=+216.079681554 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.909268 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5h2b\" (UniqueName: \"kubernetes.io/projected/844b5b79-0eb3-4e21-8754-8b10a667f6d0-kube-api-access-w5h2b\") pod \"machine-config-server-248bf\" (UID: \"844b5b79-0eb3-4e21-8754-8b10a667f6d0\") " pod="openshift-machine-config-operator/machine-config-server-248bf" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.922089 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-8vwl5" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.926166 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4n82\" (UniqueName: \"kubernetes.io/projected/dbc77181-1e16-4106-8433-da8e839d8275-kube-api-access-b4n82\") pod \"package-server-manager-789f6589d5-696kc\" (UID: \"dbc77181-1e16-4106-8433-da8e839d8275\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.927715 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.932499 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt"] Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.933862 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.938969 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbnk5\" (UniqueName: \"kubernetes.io/projected/9f89383d-9d8a-4355-95d2-79081dda4a71-kube-api-access-hbnk5\") pod \"kube-storage-version-migrator-operator-b67b599dd-trjvs\" (UID: \"9f89383d-9d8a-4355-95d2-79081dda4a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.945129 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.950713 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.959994 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" Mar 20 16:04:20 crc kubenswrapper[4708]: W0320 16:04:20.961700 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa1343fa_debb_4834_a679_e82cec21dfda.slice/crio-13cbcf1c7905441a55a7a0b79ed593ca20cfdd894cd22ac4c7e05fb67e626d80 WatchSource:0}: Error finding container 13cbcf1c7905441a55a7a0b79ed593ca20cfdd894cd22ac4c7e05fb67e626d80: Status 404 returned error can't find the container with id 13cbcf1c7905441a55a7a0b79ed593ca20cfdd894cd22ac4c7e05fb67e626d80 Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.964924 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kb6s\" (UniqueName: \"kubernetes.io/projected/c8619d97-0550-44ab-b54a-9bb0c275d6d0-kube-api-access-4kb6s\") pod \"ingress-canary-p97v5\" (UID: \"c8619d97-0550-44ab-b54a-9bb0c275d6d0\") " pod="openshift-ingress-canary/ingress-canary-p97v5" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.968822 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.974622 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w48dx" Mar 20 16:04:20 crc kubenswrapper[4708]: I0320 16:04:20.998270 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk" Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.005272 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p97v5" Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.006266 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.006376 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:21.506359783 +0000 UTC m=+216.180696498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.006731 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.007443 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-20 16:04:21.507413363 +0000 UTC m=+216.181750088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.012448 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-248bf" Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.018249 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs"] Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.027829 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n"] Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.038138 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kdrms" event={"ID":"77884518-e4d9-4a61-b8fb-55b1e2f9e23a","Type":"ContainerStarted","Data":"d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.038183 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kdrms" event={"ID":"77884518-e4d9-4a61-b8fb-55b1e2f9e23a","Type":"ContainerStarted","Data":"e6dc4f59b5c56a31fb8caa657527ba5b0d8ae42ee0a55d62c4db3d20467d88b2"} Mar 20 16:04:21 crc kubenswrapper[4708]: W0320 16:04:21.044965 4708 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60382231_11cf_4076_b6fa_2e6277ab675f.slice/crio-ccb9d87dfa0445de9ef597ff4d50b84cd14d3f4c3727215a1ee7ee3e5853352f WatchSource:0}: Error finding container ccb9d87dfa0445de9ef597ff4d50b84cd14d3f4c3727215a1ee7ee3e5853352f: Status 404 returned error can't find the container with id ccb9d87dfa0445de9ef597ff4d50b84cd14d3f4c3727215a1ee7ee3e5853352f Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.052442 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rsr2h" event={"ID":"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb","Type":"ContainerStarted","Data":"e048de819357c57243b344137408a09ac8b83b9d35f53b6eafdca7b81708d62e"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.052601 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rsr2h" event={"ID":"4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb","Type":"ContainerStarted","Data":"fdd436b41a97ecaece6b3c96c1cdedd75829c99d8ad2239fed69d01878f3a65d"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.052647 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-rsr2h" Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.075161 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" event={"ID":"93dc952e-1f6a-4e6f-b3ce-3665b4129805","Type":"ContainerStarted","Data":"15e05d0f2049344f3987e7ca7fd1e87c7d36c5a6461726c25a6a62647dc81ed8"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.082141 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt"] Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.083648 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk"] Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.089876 4708 patch_prober.go:28] interesting pod/console-operator-58897d9998-rsr2h container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.089924 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rsr2h" podUID="4761b345-6a69-4bcb-9e2a-dbe6b9f63ddb" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.099043 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" event={"ID":"0619cccd-a201-4a73-9d07-01ccc4eb7c84","Type":"ContainerStarted","Data":"8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.099158 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" event={"ID":"0619cccd-a201-4a73-9d07-01ccc4eb7c84","Type":"ContainerStarted","Data":"55ac97af6f02bc45e4b2092f6f217a91920ecbd914fc89f561973ac099d2ec7d"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.101637 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.110124 4708 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j74qm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.110239 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" podUID="0619cccd-a201-4a73-9d07-01ccc4eb7c84" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.110594 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.112660 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:21.612640996 +0000 UTC m=+216.286977701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.132543 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" event={"ID":"68a687d9-a448-4a5c-b7b9-e4510468b3c9","Type":"ContainerStarted","Data":"13cd6ad7efc2f4fc6c79fc0cc717c27d0eb3dc0f221b73e60c03555ebeb9c6cf"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.138133 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq"] Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.147094 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" event={"ID":"a62ab634-c377-4878-808b-ebaaf2e87c8a","Type":"ContainerStarted","Data":"dfb4e696e9bafd9a96500e271bc6534eb951804ecdd3a61f9498fe1373096d8f"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.164416 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" event={"ID":"fe734cb0-fdf7-45a6-9a68-a12457600931","Type":"ContainerStarted","Data":"2f271d210c9aaf2ab17ec3ce42a4a54aa064538a52c1241ff4683d73fd6d55be"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.164457 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" 
event={"ID":"fe734cb0-fdf7-45a6-9a68-a12457600931","Type":"ContainerStarted","Data":"6d1641de03af17efb6514646452e2644959064f081440d689a27ab40f002c995"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.169447 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn"] Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.178209 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.195362 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" event={"ID":"716e1008-4ee5-42c3-9b4a-5c85a53489e0","Type":"ContainerStarted","Data":"09763590777e49ffa6461a729dc5d5eaa82205e5db109bc254941209aa6330a4"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.195436 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" event={"ID":"716e1008-4ee5-42c3-9b4a-5c85a53489e0","Type":"ContainerStarted","Data":"eeb1be318e432261a83a8494154580770199da10c263346d18a464218051c089"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.213482 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.215923 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:21.715910084 +0000 UTC m=+216.390246799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.229568 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" event={"ID":"97a30f73-c3c7-4e96-b68b-9d92fea63eb8","Type":"ContainerStarted","Data":"18f538069107c928bfdeaf45641b94b1665d933c6b79073680468336dcf9ce7b"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.235960 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9"] Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.261350 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" event={"ID":"50baa880-1d72-48c6-b370-2f0094a30f23","Type":"ContainerStarted","Data":"40a92ad37b1e01c09549893c899686ebcf67e7c0f16256307e2bad38a2472e92"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.262364 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" event={"ID":"474a2884-6607-4971-9808-70ec4bc3796d","Type":"ContainerStarted","Data":"11d465c8cab33f23627b3f4ab5b1bc9e7a6c9e821ca7eec0cfa7d9637c87041c"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.267865 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2bgzp" 
event={"ID":"0a5fa365-0110-4a06-b2d7-cfd9b5745603","Type":"ContainerStarted","Data":"e4b04866ac3a0b9353d28d391fc08c1f719167cbd88dacce8fe5aa919ee1db52"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.267892 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2bgzp" event={"ID":"0a5fa365-0110-4a06-b2d7-cfd9b5745603","Type":"ContainerStarted","Data":"39d28479fa419c85df4647f6a5a8ee2a885f630a13a99757475728eb6b9d5c99"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.268755 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2bgzp" Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.271816 4708 generic.go:334] "Generic (PLEG): container finished" podID="55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3" containerID="75a485e3f366cd98af8547fa849c22aea2caa2d4c44f959b1de603d4a856dc03" exitCode=0 Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.271993 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" event={"ID":"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3","Type":"ContainerDied","Data":"75a485e3f366cd98af8547fa849c22aea2caa2d4c44f959b1de603d4a856dc03"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.272071 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" event={"ID":"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3","Type":"ContainerStarted","Data":"441e82f1f67c60df82dbbb1e993aae8d4360d23a006dd8667c001972779f2f8f"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.274359 4708 patch_prober.go:28] interesting pod/downloads-7954f5f757-2bgzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.274394 4708 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2bgzp" podUID="0a5fa365-0110-4a06-b2d7-cfd9b5745603" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.290389 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" event={"ID":"fa1343fa-debb-4834-a679-e82cec21dfda","Type":"ContainerStarted","Data":"13cbcf1c7905441a55a7a0b79ed593ca20cfdd894cd22ac4c7e05fb67e626d80"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.304141 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" event={"ID":"48933855-7a14-47ec-a83d-1787cb444869","Type":"ContainerStarted","Data":"225cc625a93b9469ce3c82fc4c970b085485cd97803c1145c47423941f462ec7"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.304182 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" event={"ID":"48933855-7a14-47ec-a83d-1787cb444869","Type":"ContainerStarted","Data":"3c3a82afd4f31136cde8a91b75325ab723d30ef6a0bfd40c21ce1bfca69322d7"} Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.316907 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.318412 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:21.818388199 +0000 UTC m=+216.492724914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.324298 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.423230 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.425112 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:21.925100554 +0000 UTC m=+216.599437269 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.433911 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5mvhn"] Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.478874 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6"] Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.525155 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.525605 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.025589974 +0000 UTC m=+216.699926679 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.626745 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.627628 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.127612386 +0000 UTC m=+216.801949101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.649553 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j74qm"]
Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.664167 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7"]
Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.731139 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.732213 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.231651596 +0000 UTC m=+216.905988311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:21 crc kubenswrapper[4708]: W0320 16:04:21.784582 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f0d60b9_4e7c_4ee8_a7f9_93f217c81603.slice/crio-700f074c3364fad53c6b486abbce78f52267960d9ec5431b5c7d9306a4cb97fe WatchSource:0}: Error finding container 700f074c3364fad53c6b486abbce78f52267960d9ec5431b5c7d9306a4cb97fe: Status 404 returned error can't find the container with id 700f074c3364fad53c6b486abbce78f52267960d9ec5431b5c7d9306a4cb97fe
Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.832840 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr"
Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.838155 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.338123544 +0000 UTC m=+217.012460269 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.941115 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.941264 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.441246048 +0000 UTC m=+217.115582763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:21 crc kubenswrapper[4708]: I0320 16:04:21.941421 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr"
Mar 20 16:04:21 crc kubenswrapper[4708]: E0320 16:04:21.941815 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.441807575 +0000 UTC m=+217.116144290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.008357 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4hmnl"]
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.043273 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:22 crc kubenswrapper[4708]: E0320 16:04:22.043434 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.543403924 +0000 UTC m=+217.217740639 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.043951 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr"
Mar 20 16:04:22 crc kubenswrapper[4708]: E0320 16:04:22.044315 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.54430675 +0000 UTC m=+217.218643465 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.054394 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d"]
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.147399 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:22 crc kubenswrapper[4708]: E0320 16:04:22.147830 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.647812925 +0000 UTC m=+217.322149640 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.248792 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr"
Mar 20 16:04:22 crc kubenswrapper[4708]: E0320 16:04:22.249382 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.749368504 +0000 UTC m=+217.423705209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:22 crc kubenswrapper[4708]: W0320 16:04:22.285067 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e3d1a5_5541_46de_ae32_4f5005a7a6c6.slice/crio-344e93eec5d1805efdfdf027ab367189197ea0475d1e0312a482e02c735e314a WatchSource:0}: Error finding container 344e93eec5d1805efdfdf027ab367189197ea0475d1e0312a482e02c735e314a: Status 404 returned error can't find the container with id 344e93eec5d1805efdfdf027ab367189197ea0475d1e0312a482e02c735e314a
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.348356 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8rllx" event={"ID":"a22db040-c541-4ade-8099-899f3581d6c6","Type":"ContainerStarted","Data":"b522673411e42c0a8b0b3e42ca76cc8f3d2fca2b4d11c9f4820e3b61c07d9566"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.353223 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:22 crc kubenswrapper[4708]: E0320 16:04:22.353786 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.853746043 +0000 UTC m=+217.528082758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.358538 4708 generic.go:334] "Generic (PLEG): container finished" podID="93dc952e-1f6a-4e6f-b3ce-3665b4129805" containerID="308f94902718657d25b1a82532666b4e87e26b57842bb0d8cfe86cff849383a3" exitCode=0
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.358644 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" event={"ID":"93dc952e-1f6a-4e6f-b3ce-3665b4129805","Type":"ContainerDied","Data":"308f94902718657d25b1a82532666b4e87e26b57842bb0d8cfe86cff849383a3"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.392149 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" event={"ID":"97a30f73-c3c7-4e96-b68b-9d92fea63eb8","Type":"ContainerStarted","Data":"8984bc4e832cc5c2fdea3f1e0e28524c30edf64d5d1d5cf37e6e02c6db0b76c7"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.392643 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.410938 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" event={"ID":"f3e5293a-140f-4740-adba-04f550109f8e","Type":"ContainerStarted","Data":"1efea766cb9db6b54221ca36dfbee1cc7518d36d606466aec68a5c57ffbd4a38"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.424358 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" event={"ID":"c5558439-83ed-4f0d-ae59-59b79ac23667","Type":"ContainerStarted","Data":"b8e72be7334f4c20776de04950a65aac33ca6d5ab3e376b3cb82f4a55801b210"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.443078 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" event={"ID":"bbdadb48-e471-474c-8eff-d3acbd3b5ced","Type":"ContainerStarted","Data":"8cea11873e919461edf84c5d83bf4281b09b701861e8f25342ea2b2f30d5728e"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.455193 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" event={"ID":"722ff406-045d-48b6-a329-df2851889a3a","Type":"ContainerStarted","Data":"d05809802cecad2bf547f84626fac58e9d7f9608dafcf29bedf55353b299948a"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.456841 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.457085 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-q9z2q" podStartSLOduration=161.457062043 podStartE2EDuration="2m41.457062043s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:22.455712784 +0000 UTC m=+217.130049519" watchObservedRunningTime="2026-03-20 16:04:22.457062043 +0000 UTC m=+217.131398758"
Mar 20 16:04:22 crc kubenswrapper[4708]: E0320 16:04:22.458656 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:22.958642277 +0000 UTC m=+217.632978992 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.473866 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" event={"ID":"d4a86699-e0df-47a7-a7d6-50ad108ffaae","Type":"ContainerStarted","Data":"ed6e336ab2d6f56816bb386d74076911f57fb98fe3efc659da6af439683e2d52"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.476948 4708 generic.go:334] "Generic (PLEG): container finished" podID="50baa880-1d72-48c6-b370-2f0094a30f23" containerID="868849477fac4083dde2647228473ea44dff2b0dbfc52912f1cb4067b6e6fdf7" exitCode=0
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.477001 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" event={"ID":"50baa880-1d72-48c6-b370-2f0094a30f23","Type":"ContainerDied","Data":"868849477fac4083dde2647228473ea44dff2b0dbfc52912f1cb4067b6e6fdf7"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.487652 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" event={"ID":"f3f468de-b540-426b-8b2f-303230c91fd3","Type":"ContainerStarted","Data":"a21a55ab945603cfc5e6b9b78f161365b28180ac6e82027aa46749666b2d6fc7"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.507361 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ftsws" podStartSLOduration=162.507345427 podStartE2EDuration="2m42.507345427s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:22.505011612 +0000 UTC m=+217.179348327" watchObservedRunningTime="2026-03-20 16:04:22.507345427 +0000 UTC m=+217.181682142"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.536719 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2bgzp" podStartSLOduration=161.53670033 podStartE2EDuration="2m41.53670033s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:22.535702242 +0000 UTC m=+217.210038957" watchObservedRunningTime="2026-03-20 16:04:22.53670033 +0000 UTC m=+217.211037045"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.558027 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:22 crc kubenswrapper[4708]: E0320 16:04:22.559169 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:23.059147336 +0000 UTC m=+217.733484051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.571413 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" event={"ID":"68a687d9-a448-4a5c-b7b9-e4510468b3c9","Type":"ContainerStarted","Data":"abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.571457 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.598531 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" event={"ID":"a62ab634-c377-4878-808b-ebaaf2e87c8a","Type":"ContainerStarted","Data":"afc0f39a2b27f79a38c85a9c2478744ba8076e1b5f1f83c507ae99661912ceee"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.620278 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-r95lq" podStartSLOduration=162.620256039 podStartE2EDuration="2m42.620256039s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:22.593374967 +0000 UTC m=+217.267711682" watchObservedRunningTime="2026-03-20 16:04:22.620256039 +0000 UTC m=+217.294592764"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.622474 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6c49" podStartSLOduration=161.622466012 podStartE2EDuration="2m41.622466012s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:22.619320612 +0000 UTC m=+217.293657327" watchObservedRunningTime="2026-03-20 16:04:22.622466012 +0000 UTC m=+217.296802727"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.653184 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" event={"ID":"dc14e84a-f5ca-4b5f-871e-059ea6092ad3","Type":"ContainerStarted","Data":"46925691a3dde5d998f50679645647e05637d18630328f3fe44cfd6b530d0a47"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.653549 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qcjhm"]
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.653567 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" event={"ID":"dc14e84a-f5ca-4b5f-871e-059ea6092ad3","Type":"ContainerStarted","Data":"aadfc21d9e3367b0678f74dbbe99c0211db363e3c47d1026f84c9a19fa63713c"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.653585 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.664224 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr"
Mar 20 16:04:22 crc kubenswrapper[4708]: E0320 16:04:22.666029 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:23.166013457 +0000 UTC m=+217.840350172 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.671968 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" event={"ID":"06c196bd-6e44-4148-8fd9-a0b9e13c09f9","Type":"ContainerStarted","Data":"e25fd1d5dce86d291c3b9c356659e3418ead87b36f706decc6f2c803955993e5"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.681343 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-rsr2h" podStartSLOduration=162.681324201 podStartE2EDuration="2m42.681324201s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:22.681038602 +0000 UTC m=+217.355375317" watchObservedRunningTime="2026-03-20 16:04:22.681324201 +0000 UTC m=+217.355660916"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.686042 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb"]
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.691871 4708 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgdfs container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body=
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.691935 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" podUID="dc14e84a-f5ca-4b5f-871e-059ea6092ad3" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.692007 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-248bf" event={"ID":"844b5b79-0eb3-4e21-8754-8b10a667f6d0","Type":"ContainerStarted","Data":"b71721799b451f769974f78112031db2c4afc8f8e568fd640a038ac049ed1d7e"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.708645 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w48dx"]
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.719995 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt" event={"ID":"60382231-11cf-4076-b6fa-2e6277ab675f","Type":"ContainerStarted","Data":"7e84b43b5d6c4999e64f667c4ac85ae8d9aa1f8a93bed90a24e57e40170fc687"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.721460 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt" event={"ID":"60382231-11cf-4076-b6fa-2e6277ab675f","Type":"ContainerStarted","Data":"ccb9d87dfa0445de9ef597ff4d50b84cd14d3f4c3727215a1ee7ee3e5853352f"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.732444 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" event={"ID":"acda3608-876c-4788-8310-481241fc9fd5","Type":"ContainerStarted","Data":"f2c249a0495d1f5df310da1ac4203229a06e2105e9201579f9c1bf6c5c7697aa"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.733825 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" event={"ID":"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603","Type":"ContainerStarted","Data":"700f074c3364fad53c6b486abbce78f52267960d9ec5431b5c7d9306a4cb97fe"}
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.734371 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" podStartSLOduration=161.734354694 podStartE2EDuration="2m41.734354694s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:22.728812977 +0000 UTC m=+217.403149692" watchObservedRunningTime="2026-03-20 16:04:22.734354694 +0000 UTC m=+217.408691409"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.783904 4708 patch_prober.go:28] interesting pod/downloads-7954f5f757-2bgzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.783983 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2bgzp" podUID="0a5fa365-0110-4a06-b2d7-cfd9b5745603" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.786470 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:22 crc kubenswrapper[4708]: E0320 16:04:22.787556 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:23.287522622 +0000 UTC m=+217.961859337 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.794320 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" podStartSLOduration=161.794303043 podStartE2EDuration="2m41.794303043s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:22.789724834 +0000 UTC m=+217.464061559" watchObservedRunningTime="2026-03-20 16:04:22.794303043 +0000 UTC m=+217.468639758"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.803491 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.874303 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk"]
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.888568 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc"]
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.888838 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr"
Mar 20 16:04:22 crc kubenswrapper[4708]: E0320 16:04:22.889131 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:23.389119242 +0000 UTC m=+218.063455957 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.893023 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p97v5"]
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.895229 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr"]
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.902379 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-rsr2h"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.917865 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-kdrms" podStartSLOduration=162.917841526 podStartE2EDuration="2m42.917841526s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:22.912129044 +0000 UTC m=+217.586465759" watchObservedRunningTime="2026-03-20 16:04:22.917841526 +0000 UTC m=+217.592178241"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.953336 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s"]
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.956767 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs"]
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.972935 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" podStartSLOduration=161.972913907 podStartE2EDuration="2m41.972913907s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:22.953090195 +0000 UTC m=+217.627426910" watchObservedRunningTime="2026-03-20 16:04:22.972913907 +0000 UTC m=+217.647250622"
Mar 20 16:04:22 crc kubenswrapper[4708]: I0320 16:04:22.990410 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 16:04:22 crc kubenswrapper[4708]: E0320 16:04:22.990884 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:23.490861086 +0000 UTC m=+218.165197801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.004412 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6h42j"]
Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.094471 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr"
Mar 20 16:04:23 crc kubenswrapper[4708]: E0320 16:04:23.094920 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:23.594905516 +0000 UTC m=+218.269242221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.098080 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-8vwl5"] Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.195157 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:23 crc kubenswrapper[4708]: E0320 16:04:23.196062 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:23.696047043 +0000 UTC m=+218.370383758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.203473 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" podStartSLOduration=162.203455184 podStartE2EDuration="2m42.203455184s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:23.187620944 +0000 UTC m=+217.861957659" watchObservedRunningTime="2026-03-20 16:04:23.203455184 +0000 UTC m=+217.877791899" Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.223186 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" podStartSLOduration=163.223159612 podStartE2EDuration="2m43.223159612s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:23.222205475 +0000 UTC m=+217.896542190" watchObservedRunningTime="2026-03-20 16:04:23.223159612 +0000 UTC m=+217.897496317" Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.297573 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: 
\"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:23 crc kubenswrapper[4708]: E0320 16:04:23.298016 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:23.797999094 +0000 UTC m=+218.472335809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.298906 4708 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.392968 4708 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-r9jtp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.393039 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" podUID="97a30f73-c3c7-4e96-b68b-9d92fea63eb8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 
16:04:23.398927 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:23 crc kubenswrapper[4708]: E0320 16:04:23.399078 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:23.899058039 +0000 UTC m=+218.573394764 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.399267 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:23 crc kubenswrapper[4708]: E0320 16:04:23.399542 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:23.899534823 +0000 UTC m=+218.573871538 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.504461 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:23 crc kubenswrapper[4708]: E0320 16:04:23.507780 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:24.007759391 +0000 UTC m=+218.682096096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.508123 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:23 crc kubenswrapper[4708]: E0320 16:04:23.508423 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:24.008407649 +0000 UTC m=+218.682744364 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.574822 4708 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hpk69 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.575238 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" podUID="68a687d9-a448-4a5c-b7b9-e4510468b3c9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.614873 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:23 crc kubenswrapper[4708]: E0320 16:04:23.615347 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:24.115332511 +0000 UTC m=+218.789669226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.722052 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:23 crc kubenswrapper[4708]: E0320 16:04:23.722402 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:24.222389936 +0000 UTC m=+218.896726651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.798959 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" event={"ID":"f3f468de-b540-426b-8b2f-303230c91fd3","Type":"ContainerStarted","Data":"f70d9f3f57e67d44a33e9be0e70e597bec4f8265b5d492c943178269758a5a51"} Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.825274 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:23 crc kubenswrapper[4708]: E0320 16:04:23.826062 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:24.326039734 +0000 UTC m=+219.000376449 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.838394 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p97v5" event={"ID":"c8619d97-0550-44ab-b54a-9bb0c275d6d0","Type":"ContainerStarted","Data":"9cb88d6d8a7010b0499d5b4827daa3e7143fd16b71731feefb887301e25abbea"} Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.849077 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hrsxq" podStartSLOduration=162.849043767 podStartE2EDuration="2m42.849043767s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:23.837439817 +0000 UTC m=+218.511776522" watchObservedRunningTime="2026-03-20 16:04:23.849043767 +0000 UTC m=+218.523380492" Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.872211 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" event={"ID":"c5558439-83ed-4f0d-ae59-59b79ac23667","Type":"ContainerStarted","Data":"5a67ac59d0d88e246c70b7e2180a3ebd1c132acbd46684d51711773bf5e7699b"} Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.873604 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.895882 4708 
patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8bbz9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.895944 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" podUID="c5558439-83ed-4f0d-ae59-59b79ac23667" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.926826 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" event={"ID":"bbdadb48-e471-474c-8eff-d3acbd3b5ced","Type":"ContainerStarted","Data":"505cf874b868d3b9b23dd7311aeee898e382c267638737e6019cc9ee328610c5"} Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.928519 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:23 crc kubenswrapper[4708]: E0320 16:04:23.929475 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:24.429463236 +0000 UTC m=+219.103799951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.955873 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk" event={"ID":"aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6","Type":"ContainerStarted","Data":"4c2f549d7d2577e81ba042ddd4dd6a87c8832af293c3344e54007253a703c6c2"} Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.966540 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" podStartSLOduration=162.966521618 podStartE2EDuration="2m42.966521618s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:23.927703657 +0000 UTC m=+218.602040372" watchObservedRunningTime="2026-03-20 16:04:23.966521618 +0000 UTC m=+218.640858333" Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.970534 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qcjhm" event={"ID":"c7bf1b0e-5792-415a-8c81-2dafe6019fac","Type":"ContainerStarted","Data":"1b81cb0381c3b54e6c1bf0d984136239d9e6f47e3a7196ce0ef849234cd927b0"} Mar 20 16:04:23 crc kubenswrapper[4708]: I0320 16:04:23.982808 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" 
event={"ID":"9f89383d-9d8a-4355-95d2-79081dda4a71","Type":"ContainerStarted","Data":"a4e4253e80deea78356e91f5184869e6cd4eadfd06f5134830888889be85f4a9"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.031183 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:24 crc kubenswrapper[4708]: E0320 16:04:24.032186 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:24.532170438 +0000 UTC m=+219.206507143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.042163 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" event={"ID":"6183b8c1-817c-4778-8310-b0481ebcc004","Type":"ContainerStarted","Data":"32e91bf2330598e499eeed2b79256a1b477c311e7e0061c5975fac560f90a65c"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.070233 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" 
event={"ID":"93dc952e-1f6a-4e6f-b3ce-3665b4129805","Type":"ContainerStarted","Data":"9057888fd707c2f433d6901d802ad826c6549f7b3598a287d8df06c3135f6cd1"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.070906 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.072651 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2h7nt" podStartSLOduration=163.072639255 podStartE2EDuration="2m43.072639255s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:23.96872779 +0000 UTC m=+218.643064505" watchObservedRunningTime="2026-03-20 16:04:24.072639255 +0000 UTC m=+218.746975970" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.073274 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" podStartSLOduration=163.073270694 podStartE2EDuration="2m43.073270694s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.071457333 +0000 UTC m=+218.745794048" watchObservedRunningTime="2026-03-20 16:04:24.073270694 +0000 UTC m=+218.747607409" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.095092 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" event={"ID":"06c196bd-6e44-4148-8fd9-a0b9e13c09f9","Type":"ContainerStarted","Data":"d67dbc7a59e76f3d71004cdf2f36bb48f12fa825251c8b4e3b81cda1635e379d"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.105805 4708 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" podStartSLOduration=164.105789996 podStartE2EDuration="2m44.105789996s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.101105283 +0000 UTC m=+218.775441998" watchObservedRunningTime="2026-03-20 16:04:24.105789996 +0000 UTC m=+218.780126711" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.116921 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-248bf" event={"ID":"844b5b79-0eb3-4e21-8754-8b10a667f6d0","Type":"ContainerStarted","Data":"a39fa694ea02e6cfefd89a0725151d575faf7d091f1ffd465533dab16249b6ed"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.133470 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:24 crc kubenswrapper[4708]: E0320 16:04:24.136855 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:24.636844456 +0000 UTC m=+219.311181171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.141571 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vnqpk" podStartSLOduration=163.14155553 podStartE2EDuration="2m43.14155553s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.140765517 +0000 UTC m=+218.815102252" watchObservedRunningTime="2026-03-20 16:04:24.14155553 +0000 UTC m=+218.815892245" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.148891 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" event={"ID":"4437a79c-97de-40ff-a2fa-d29cc2f86828","Type":"ContainerStarted","Data":"9baf985b21d59446d3a7141ba543e84f72f499f354436709458db0e324d3e245"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.156996 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" event={"ID":"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7","Type":"ContainerStarted","Data":"7b781e8ab5403143556a21a3b8fa3a3c45da5a63cd10c709deba74e38a90e3dd"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.187348 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-248bf" podStartSLOduration=7.187330227 
podStartE2EDuration="7.187330227s" podCreationTimestamp="2026-03-20 16:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.180980828 +0000 UTC m=+218.855317543" watchObservedRunningTime="2026-03-20 16:04:24.187330227 +0000 UTC m=+218.861666942" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.196420 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8rllx" event={"ID":"a22db040-c541-4ade-8099-899f3581d6c6","Type":"ContainerStarted","Data":"a9d543056a0763e66c2a0cedc2742547fec458a0dd22be5f82f1117749d34c24"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.204900 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" event={"ID":"dbc77181-1e16-4106-8433-da8e839d8275","Type":"ContainerStarted","Data":"5b9f5fdd1c8f37e274cc0107376d64b760c120f1ffd88bc64bf7bf7f908546da"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.225905 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w48dx" event={"ID":"05d36f6d-3186-48d6-8388-b4c1b6e02bd2","Type":"ContainerStarted","Data":"0d83af344b19eeb1652608152c49e37c15afa99a846e165df0379a2a06f974bc"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.240396 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:24 crc kubenswrapper[4708]: E0320 16:04:24.241596 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:24.741573065 +0000 UTC m=+219.415909780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.248370 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8rllx" podStartSLOduration=163.248352448 podStartE2EDuration="2m43.248352448s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.244707964 +0000 UTC m=+218.919044679" watchObservedRunningTime="2026-03-20 16:04:24.248352448 +0000 UTC m=+218.922689153" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.256184 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" event={"ID":"f3e5293a-140f-4740-adba-04f550109f8e","Type":"ContainerStarted","Data":"9530a3f397a15920d788c0665d42552cac05872af192f072e4c7c9db54e673db"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.308785 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" event={"ID":"722ff406-045d-48b6-a329-df2851889a3a","Type":"ContainerStarted","Data":"3dadf53c786f014731a5567ad1202670af5f1b3ccc40f819a863e4bb70ac73b4"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.320581 4708 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" event={"ID":"69e3d1a5-5541-46de-ae32-4f5005a7a6c6","Type":"ContainerStarted","Data":"55870a071ca93e5332f558c4ff850bc7eb21596f1080a3a8c4239fcde6b68046"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.320637 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" event={"ID":"69e3d1a5-5541-46de-ae32-4f5005a7a6c6","Type":"ContainerStarted","Data":"344e93eec5d1805efdfdf027ab367189197ea0475d1e0312a482e02c735e314a"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.325207 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" event={"ID":"a62ab634-c377-4878-808b-ebaaf2e87c8a","Type":"ContainerStarted","Data":"2454929c727ae7c6e5c5e14e115479217322a034430b0a29423233d004c9bc18"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.340281 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-8vwl5" event={"ID":"69fd6ec4-db77-4309-977a-7e80359aec50","Type":"ContainerStarted","Data":"a11741a8b0168b98a71d31aacc099c827f38a9461d13989422569dc7538c97fd"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.342446 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:24 crc kubenswrapper[4708]: E0320 16:04:24.343261 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:24.843241448 +0000 UTC m=+219.517578163 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.378542 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qh88t" podStartSLOduration=163.378526728 podStartE2EDuration="2m43.378526728s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.377138239 +0000 UTC m=+219.051474944" watchObservedRunningTime="2026-03-20 16:04:24.378526728 +0000 UTC m=+219.052863443" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.379569 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k2mxn" podStartSLOduration=163.379561438 podStartE2EDuration="2m43.379561438s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.349466754 +0000 UTC m=+219.023803469" watchObservedRunningTime="2026-03-20 16:04:24.379561438 +0000 UTC m=+219.053898153" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.383645 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" 
event={"ID":"50baa880-1d72-48c6-b370-2f0094a30f23","Type":"ContainerStarted","Data":"d223a58b8b793c4f237e9eb224ad64aec0f029667c711a36b0cff705c8063673"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.428051 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" event={"ID":"55b538ad-b2b1-4b7d-87e1-5ea5038ac7f3","Type":"ContainerStarted","Data":"f0052e479c15dfa863e83cf177141616087540225f25905f4307a7dbf5bd6124"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.446004 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:24 crc kubenswrapper[4708]: E0320 16:04:24.447259 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:24.947242646 +0000 UTC m=+219.621579361 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.473243 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" event={"ID":"d4a86699-e0df-47a7-a7d6-50ad108ffaae","Type":"ContainerStarted","Data":"6caf4afd37108ad0ee1392d16d74a9ea5e0467307d73a9dfd2e97526573d8c0f"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.480839 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" podStartSLOduration=163.480818498 podStartE2EDuration="2m43.480818498s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.475287461 +0000 UTC m=+219.149624186" watchObservedRunningTime="2026-03-20 16:04:24.480818498 +0000 UTC m=+219.155155213" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.526619 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" event={"ID":"474a2884-6607-4971-9808-70ec4bc3796d","Type":"ContainerStarted","Data":"c44205abb61855cfb97ee1176a4b4fc9bc2a88d01ad1927a84f6146e0f545a15"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.533349 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" podStartSLOduration=164.533321596 
podStartE2EDuration="2m44.533321596s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.531105124 +0000 UTC m=+219.205441839" watchObservedRunningTime="2026-03-20 16:04:24.533321596 +0000 UTC m=+219.207658301" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.547353 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:24 crc kubenswrapper[4708]: E0320 16:04:24.550038 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:25.05001544 +0000 UTC m=+219.724352155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.560261 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" event={"ID":"acda3608-876c-4788-8310-481241fc9fd5","Type":"ContainerStarted","Data":"49fd28808a1147f9f917cf55c27943a2278089068fcd41bea06d93ac43c87ae3"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.578480 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" event={"ID":"c2c9b8a5-ad39-4793-b85e-6282872b25f6","Type":"ContainerStarted","Data":"ef9460e7dabbac4594114aed6d9b3287f2bccad423192ad3df579b94c45f78f2"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.592530 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-762wg" podStartSLOduration=163.592500635 podStartE2EDuration="2m43.592500635s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.591966879 +0000 UTC m=+219.266303594" watchObservedRunningTime="2026-03-20 16:04:24.592500635 +0000 UTC m=+219.266837350" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.600092 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt" 
event={"ID":"60382231-11cf-4076-b6fa-2e6277ab675f","Type":"ContainerStarted","Data":"a98cf6ce28bc10a716c11ef99d52638ba9fd1ca304b99463ebebac1006f86e5f"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.635643 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" event={"ID":"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603","Type":"ContainerStarted","Data":"30b54d7f8c28d265ca81394feedd91f30deece6bcc14aa5a13f2d914380c314f"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.649154 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:24 crc kubenswrapper[4708]: E0320 16:04:24.649663 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:25.149629284 +0000 UTC m=+219.823965989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.664756 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" event={"ID":"fa1343fa-debb-4834-a679-e82cec21dfda","Type":"ContainerStarted","Data":"ec2008b2a1e8b2bf384ab08a18965d9afefa16f63e13af286b947e063e7e8bb4"} Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.665381 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" podUID="40d77bdf-8222-4072-bd4b-b766e73992cc" containerName="route-controller-manager" containerID="cri-o://48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0" gracePeriod=30 Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.665463 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" podUID="0619cccd-a201-4a73-9d07-01ccc4eb7c84" containerName="controller-manager" containerID="cri-o://8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0" gracePeriod=30 Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.667525 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hgxt" podStartSLOduration=163.667515021 podStartE2EDuration="2m43.667515021s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.667230733 +0000 UTC m=+219.341567448" watchObservedRunningTime="2026-03-20 16:04:24.667515021 +0000 UTC m=+219.341851736" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.670637 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5mvhn" podStartSLOduration=163.670628869 podStartE2EDuration="2m43.670628869s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.628763393 +0000 UTC m=+219.303100108" watchObservedRunningTime="2026-03-20 16:04:24.670628869 +0000 UTC m=+219.344965584" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.685598 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgdfs" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.688878 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.706245 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r9jtp" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.751150 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:24 crc kubenswrapper[4708]: E0320 16:04:24.752958 4708 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:25.252943273 +0000 UTC m=+219.927279998 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.762326 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" podStartSLOduration=163.762291338 podStartE2EDuration="2m43.762291338s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:24.712909418 +0000 UTC m=+219.387246133" watchObservedRunningTime="2026-03-20 16:04:24.762291338 +0000 UTC m=+219.436628053" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.835808 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.841055 4708 patch_prober.go:28] interesting pod/router-default-5444994796-8rllx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:04:24 crc kubenswrapper[4708]: [-]has-synced failed: reason withheld Mar 20 16:04:24 crc kubenswrapper[4708]: [+]process-running ok Mar 20 16:04:24 crc kubenswrapper[4708]: healthz 
check failed Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.841116 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8rllx" podUID="a22db040-c541-4ade-8099-899f3581d6c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.853949 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:24 crc kubenswrapper[4708]: E0320 16:04:24.856475 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:25.356448957 +0000 UTC m=+220.030785672 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.859416 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:24 crc kubenswrapper[4708]: E0320 16:04:24.860840 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:25.360814511 +0000 UTC m=+220.035151426 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:24 crc kubenswrapper[4708]: I0320 16:04:24.960654 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:24 crc kubenswrapper[4708]: E0320 16:04:24.961331 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:25.46130118 +0000 UTC m=+220.135637895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.034784 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.034844 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.063160 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.063720 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:25.563664352 +0000 UTC m=+220.238001207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.164681 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.165355 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:25.665339435 +0000 UTC m=+220.339676150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.266971 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.267347 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:25.767333857 +0000 UTC m=+220.441670572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.352375 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.367857 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-config\") pod \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.368105 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.368138 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvw5r\" (UniqueName: \"kubernetes.io/projected/0619cccd-a201-4a73-9d07-01ccc4eb7c84-kube-api-access-vvw5r\") pod \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.368247 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-client-ca\") pod \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.368350 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-proxy-ca-bundles\") pod \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.368376 4708 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0619cccd-a201-4a73-9d07-01ccc4eb7c84-serving-cert\") pod \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\" (UID: \"0619cccd-a201-4a73-9d07-01ccc4eb7c84\") " Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.371718 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:25.871687065 +0000 UTC m=+220.546023780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.372313 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0619cccd-a201-4a73-9d07-01ccc4eb7c84" (UID: "0619cccd-a201-4a73-9d07-01ccc4eb7c84"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.372468 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-config" (OuterVolumeSpecName: "config") pod "0619cccd-a201-4a73-9d07-01ccc4eb7c84" (UID: "0619cccd-a201-4a73-9d07-01ccc4eb7c84"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.372540 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-client-ca" (OuterVolumeSpecName: "client-ca") pod "0619cccd-a201-4a73-9d07-01ccc4eb7c84" (UID: "0619cccd-a201-4a73-9d07-01ccc4eb7c84"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.387875 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0619cccd-a201-4a73-9d07-01ccc4eb7c84-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0619cccd-a201-4a73-9d07-01ccc4eb7c84" (UID: "0619cccd-a201-4a73-9d07-01ccc4eb7c84"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.391395 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0619cccd-a201-4a73-9d07-01ccc4eb7c84-kube-api-access-vvw5r" (OuterVolumeSpecName: "kube-api-access-vvw5r") pod "0619cccd-a201-4a73-9d07-01ccc4eb7c84" (UID: "0619cccd-a201-4a73-9d07-01ccc4eb7c84"). InnerVolumeSpecName "kube-api-access-vvw5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.413513 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb"] Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.413819 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0619cccd-a201-4a73-9d07-01ccc4eb7c84" containerName="controller-manager" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.413834 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="0619cccd-a201-4a73-9d07-01ccc4eb7c84" containerName="controller-manager" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.413939 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="0619cccd-a201-4a73-9d07-01ccc4eb7c84" containerName="controller-manager" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.414400 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.419012 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb"] Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.449493 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.479311 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40d77bdf-8222-4072-bd4b-b766e73992cc-serving-cert\") pod \"40d77bdf-8222-4072-bd4b-b766e73992cc\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.479476 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-config\") pod \"40d77bdf-8222-4072-bd4b-b766e73992cc\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.479706 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2bnl\" (UniqueName: \"kubernetes.io/projected/40d77bdf-8222-4072-bd4b-b766e73992cc-kube-api-access-v2bnl\") pod \"40d77bdf-8222-4072-bd4b-b766e73992cc\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.479733 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-client-ca\") pod \"40d77bdf-8222-4072-bd4b-b766e73992cc\" (UID: \"40d77bdf-8222-4072-bd4b-b766e73992cc\") " Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.480021 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-config\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.480098 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e7a051-112d-4679-821d-75c5bbe8b51f-serving-cert\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.480141 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-client-ca\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.480214 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh5m8\" (UniqueName: \"kubernetes.io/projected/d1e7a051-112d-4679-821d-75c5bbe8b51f-kube-api-access-zh5m8\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.480266 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-proxy-ca-bundles\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.481196 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "40d77bdf-8222-4072-bd4b-b766e73992cc" (UID: 
"40d77bdf-8222-4072-bd4b-b766e73992cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.481419 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-config" (OuterVolumeSpecName: "config") pod "40d77bdf-8222-4072-bd4b-b766e73992cc" (UID: "40d77bdf-8222-4072-bd4b-b766e73992cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.481586 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.481704 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0619cccd-a201-4a73-9d07-01ccc4eb7c84-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.481718 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.481728 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.481741 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvw5r\" (UniqueName: 
\"kubernetes.io/projected/0619cccd-a201-4a73-9d07-01ccc4eb7c84-kube-api-access-vvw5r\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.481768 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40d77bdf-8222-4072-bd4b-b766e73992cc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.481778 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.481789 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0619cccd-a201-4a73-9d07-01ccc4eb7c84-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.482222 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:25.982205108 +0000 UTC m=+220.656541823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.500077 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40d77bdf-8222-4072-bd4b-b766e73992cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "40d77bdf-8222-4072-bd4b-b766e73992cc" (UID: "40d77bdf-8222-4072-bd4b-b766e73992cc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.503384 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40d77bdf-8222-4072-bd4b-b766e73992cc-kube-api-access-v2bnl" (OuterVolumeSpecName: "kube-api-access-v2bnl") pod "40d77bdf-8222-4072-bd4b-b766e73992cc" (UID: "40d77bdf-8222-4072-bd4b-b766e73992cc"). InnerVolumeSpecName "kube-api-access-v2bnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.583715 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.583897 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:26.083871941 +0000 UTC m=+220.758208646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.584027 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-config\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.584067 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e7a051-112d-4679-821d-75c5bbe8b51f-serving-cert\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" 
(UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.584174 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-client-ca\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.584230 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh5m8\" (UniqueName: \"kubernetes.io/projected/d1e7a051-112d-4679-821d-75c5bbe8b51f-kube-api-access-zh5m8\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.584252 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-proxy-ca-bundles\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.584335 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.584427 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2bnl\" 
(UniqueName: \"kubernetes.io/projected/40d77bdf-8222-4072-bd4b-b766e73992cc-kube-api-access-v2bnl\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.584443 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40d77bdf-8222-4072-bd4b-b766e73992cc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.584727 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:26.084720045 +0000 UTC m=+220.759056760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.586660 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-config\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.588652 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-client-ca\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " 
pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.589021 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-proxy-ca-bundles\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.592008 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e7a051-112d-4679-821d-75c5bbe8b51f-serving-cert\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.651468 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh5m8\" (UniqueName: \"kubernetes.io/projected/d1e7a051-112d-4679-821d-75c5bbe8b51f-kube-api-access-zh5m8\") pod \"controller-manager-5d7fb5f6bc-bqtpb\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.688238 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.688819 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-20 16:04:26.188803436 +0000 UTC m=+220.863140151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.690834 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vzncs"] Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.691060 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d77bdf-8222-4072-bd4b-b766e73992cc" containerName="route-controller-manager" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.691077 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d77bdf-8222-4072-bd4b-b766e73992cc" containerName="route-controller-manager" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.691194 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d77bdf-8222-4072-bd4b-b766e73992cc" containerName="route-controller-manager" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.691830 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.696240 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.722043 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" event={"ID":"9f89383d-9d8a-4355-95d2-79081dda4a71","Type":"ContainerStarted","Data":"d8c950aa85e9c0e433ea730fd42575c9f79b2d0b53e89c0e500198c2df40c0cd"} Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.738488 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vzncs"] Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.762934 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.790288 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-catalog-content\") pod \"community-operators-vzncs\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.790630 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.790710 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-utilities\") pod \"community-operators-vzncs\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.790733 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2vz6\" (UniqueName: \"kubernetes.io/projected/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-kube-api-access-n2vz6\") pod \"community-operators-vzncs\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.808099 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" event={"ID":"50baa880-1d72-48c6-b370-2f0094a30f23","Type":"ContainerStarted","Data":"f64888a32621d0385bb2d6fb139e6fd5bd61ca83197563949ef5ff483cd8a201"} Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.809248 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:26.30921006 +0000 UTC m=+220.983546775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.832297 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xfcd6" event={"ID":"2f0d60b9-4e7c-4ee8-a7f9-93f217c81603","Type":"ContainerStarted","Data":"1b5cce82b7ef815b27e6cd4e0d22ffcc62ff62544c27706221eb1c290817e1f6"} Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.860586 4708 patch_prober.go:28] interesting pod/router-default-5444994796-8rllx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:04:25 crc kubenswrapper[4708]: [-]has-synced failed: reason withheld Mar 20 16:04:25 crc kubenswrapper[4708]: [+]process-running ok Mar 20 16:04:25 crc kubenswrapper[4708]: healthz check failed Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.860681 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8rllx" podUID="a22db040-c541-4ade-8099-899f3581d6c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.892576 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.892888 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-utilities\") pod \"community-operators-vzncs\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.892940 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2vz6\" (UniqueName: \"kubernetes.io/projected/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-kube-api-access-n2vz6\") pod \"community-operators-vzncs\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.892984 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-catalog-content\") pod \"community-operators-vzncs\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:04:25 crc kubenswrapper[4708]: E0320 16:04:25.894111 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:26.394088546 +0000 UTC m=+221.068425261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.895494 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-utilities\") pod \"community-operators-vzncs\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.895756 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-catalog-content\") pod \"community-operators-vzncs\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.913171 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h4kcq"] Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.917851 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" podStartSLOduration=165.917810258 podStartE2EDuration="2m45.917810258s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:25.910222204 +0000 UTC m=+220.584558919" watchObservedRunningTime="2026-03-20 16:04:25.917810258 +0000 UTC m=+220.592146973" Mar 20 16:04:25 crc 
kubenswrapper[4708]: I0320 16:04:25.929379 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w48dx" event={"ID":"05d36f6d-3186-48d6-8388-b4c1b6e02bd2","Type":"ContainerStarted","Data":"c89360e002cd75a4caae7624f50948ece7956d1db6c5895979201d74c27352be"} Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.929511 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.946411 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.952954 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" event={"ID":"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7","Type":"ContainerStarted","Data":"74a8b9763e8602e2b20535d9ea99cfba067cce4383d20eeda50566ccf5265b27"} Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.968804 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h4kcq"] Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.970819 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2vz6\" (UniqueName: \"kubernetes.io/projected/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-kube-api-access-n2vz6\") pod \"community-operators-vzncs\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.972427 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk" event={"ID":"aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6","Type":"ContainerStarted","Data":"7c1d1e52b8bf5205068f535cb0d89038f56c6339e60ec46fd1bd6265ddecfc5e"} Mar 20 16:04:25 crc kubenswrapper[4708]: 
I0320 16:04:25.987721 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" event={"ID":"f3e5293a-140f-4740-adba-04f550109f8e","Type":"ContainerStarted","Data":"5eaf4ee033bd8bc365472ada4df25c8f43585dbc844cb747f3dfe3d08c9a520b"} Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.990869 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.994197 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trjvs" podStartSLOduration=164.994181884 podStartE2EDuration="2m44.994181884s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:25.989513751 +0000 UTC m=+220.663850466" watchObservedRunningTime="2026-03-20 16:04:25.994181884 +0000 UTC m=+220.668518599" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.994435 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-utilities\") pod \"certified-operators-h4kcq\" (UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.994520 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh56j\" (UniqueName: \"kubernetes.io/projected/ca697a79-760d-4ae1-827f-bc2b0aee1785-kube-api-access-wh56j\") pod \"certified-operators-h4kcq\" (UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:04:25 crc 
kubenswrapper[4708]: I0320 16:04:25.994546 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-catalog-content\") pod \"certified-operators-h4kcq\" (UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:04:25 crc kubenswrapper[4708]: I0320 16:04:25.994608 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 16:04:26.008049 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:26.507848472 +0000 UTC m=+221.182185187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.026083 4708 generic.go:334] "Generic (PLEG): container finished" podID="0619cccd-a201-4a73-9d07-01ccc4eb7c84" containerID="8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0" exitCode=0 Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.026197 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.026201 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" event={"ID":"0619cccd-a201-4a73-9d07-01ccc4eb7c84","Type":"ContainerDied","Data":"8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.026255 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j74qm" event={"ID":"0619cccd-a201-4a73-9d07-01ccc4eb7c84","Type":"ContainerDied","Data":"55ac97af6f02bc45e4b2092f6f217a91920ecbd914fc89f561973ac099d2ec7d"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.026272 4708 scope.go:117] "RemoveContainer" containerID="8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.034377 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p97v5" 
event={"ID":"c8619d97-0550-44ab-b54a-9bb0c275d6d0","Type":"ContainerStarted","Data":"d68fa9e63a00cf76bec8f4093d71f5f143320f78b153ed2252c2039c91978fd7"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.034632 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.093127 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pmd2p"] Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.097222 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.097574 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-utilities\") pod \"certified-operators-h4kcq\" (UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.097645 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh56j\" (UniqueName: \"kubernetes.io/projected/ca697a79-760d-4ae1-827f-bc2b0aee1785-kube-api-access-wh56j\") pod \"certified-operators-h4kcq\" (UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.097682 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-catalog-content\") pod \"certified-operators-h4kcq\" 
(UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.113831 4708 scope.go:117] "RemoveContainer" containerID="8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.115732 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-utilities\") pod \"certified-operators-h4kcq\" (UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.116396 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-catalog-content\") pod \"certified-operators-h4kcq\" (UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 16:04:26.118613 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:26.61856458 +0000 UTC m=+221.292901295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.124926 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 16:04:26.139896 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0\": container with ID starting with 8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0 not found: ID does not exist" containerID="8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.139945 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0"} err="failed to get container status \"8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0\": rpc error: code = NotFound desc = could not find container \"8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0\": container with ID starting with 8935743c45ddf8a6cac06fabaf39897f1bf8e717127be259d0f730d01d9ef2c0 not found: ID does not exist" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.142286 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qd9xk" podStartSLOduration=165.142266333 podStartE2EDuration="2m45.142266333s" 
podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.141608473 +0000 UTC m=+220.815945188" watchObservedRunningTime="2026-03-20 16:04:26.142266333 +0000 UTC m=+220.816603048" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.183308 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qcjhm" event={"ID":"c7bf1b0e-5792-415a-8c81-2dafe6019fac","Type":"ContainerStarted","Data":"57240ce7cd08287afee7f907da635c2333f5ce21f997fd2c3be5b695eccfb7bd"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.183351 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qcjhm" event={"ID":"c7bf1b0e-5792-415a-8c81-2dafe6019fac","Type":"ContainerStarted","Data":"3a94ee59cbe61f52b43d26af08481c10726b1c7721a7872b44d6c47a180420c1"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.183379 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.183817 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.183877 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.183965 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wh56j\" (UniqueName: \"kubernetes.io/projected/ca697a79-760d-4ae1-827f-bc2b0aee1785-kube-api-access-wh56j\") pod \"certified-operators-h4kcq\" (UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.193535 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pmd2p"] Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.203466 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-utilities\") pod \"community-operators-pmd2p\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.203547 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.203620 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r5tz\" (UniqueName: \"kubernetes.io/projected/b626637b-4d54-4743-9717-142fe62e392b-kube-api-access-7r5tz\") pod \"community-operators-pmd2p\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.203651 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-catalog-content\") 
pod \"community-operators-pmd2p\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 16:04:26.205315 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:26.7053037 +0000 UTC m=+221.379640415 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.206256 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" podStartSLOduration=166.206245716 podStartE2EDuration="2m46.206245716s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.185747525 +0000 UTC m=+220.860084240" watchObservedRunningTime="2026-03-20 16:04:26.206245716 +0000 UTC m=+220.880582431" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.216352 4708 generic.go:334] "Generic (PLEG): container finished" podID="40d77bdf-8222-4072-bd4b-b766e73992cc" containerID="48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0" exitCode=0 Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.216442 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" event={"ID":"40d77bdf-8222-4072-bd4b-b766e73992cc","Type":"ContainerDied","Data":"48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.216469 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" event={"ID":"40d77bdf-8222-4072-bd4b-b766e73992cc","Type":"ContainerDied","Data":"b45798254d0a5a2f65a59a76d8589991ef97b8df32909d506d67468ea328edde"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.216489 4708 scope.go:117] "RemoveContainer" containerID="48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.216620 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.244694 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gd2q5" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.258252 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.261160 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mpv8d" podStartSLOduration=165.261136072 podStartE2EDuration="2m45.261136072s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.22647928 +0000 UTC m=+220.900815985" watchObservedRunningTime="2026-03-20 16:04:26.261136072 +0000 UTC m=+220.935472787" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.261866 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5frmt"] Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.262962 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.277870 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" event={"ID":"69e3d1a5-5541-46de-ae32-4f5005a7a6c6","Type":"ContainerStarted","Data":"4d964f677f74db18ca98ded5f2ca24c0a60a2e9a1a3b260bdaed2910f922643e"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.295417 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mlg6n" event={"ID":"d4a86699-e0df-47a7-a7d6-50ad108ffaae","Type":"ContainerStarted","Data":"56dc1cb35da5cc5d0825d1f0ce344689004a13011e6fdf304c9dae31f7a407c0"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.311248 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.311564 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r5tz\" (UniqueName: \"kubernetes.io/projected/b626637b-4d54-4743-9717-142fe62e392b-kube-api-access-7r5tz\") pod \"community-operators-pmd2p\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.311593 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-catalog-content\") pod \"certified-operators-5frmt\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.311636 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-catalog-content\") pod \"community-operators-pmd2p\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.311686 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q884g\" (UniqueName: \"kubernetes.io/projected/cdc1f7ff-9725-4049-b06c-50a4adfa1696-kube-api-access-q884g\") pod \"certified-operators-5frmt\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.311722 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-utilities\") pod \"certified-operators-5frmt\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.311744 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-utilities\") pod \"community-operators-pmd2p\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.312189 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-utilities\") pod \"community-operators-pmd2p\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 16:04:26.312203 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:26.812180689 +0000 UTC m=+221.486517415 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.313217 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-catalog-content\") pod \"community-operators-pmd2p\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.313271 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j74qm"] Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.316246 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j74qm"] Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.317958 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5frmt"] Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.324987 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" event={"ID":"dbc77181-1e16-4106-8433-da8e839d8275","Type":"ContainerStarted","Data":"15c06c3a0fd2b0249f354c98f871c4e8940fc9568eed9f1752d63ebf95145596"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.325040 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" 
event={"ID":"dbc77181-1e16-4106-8433-da8e839d8275","Type":"ContainerStarted","Data":"374ee7c8958ff9ece84129e139d7d1999683ab8977a89e7c6a069c453e33a71a"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.325751 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.375312 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" event={"ID":"4437a79c-97de-40ff-a2fa-d29cc2f86828","Type":"ContainerStarted","Data":"aaab20e4741bdb770e63c33a3ab8ab4cb220a2cba1042f4bb09df1578443eab7"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.376842 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.377139 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r5tz\" (UniqueName: \"kubernetes.io/projected/b626637b-4d54-4743-9717-142fe62e392b-kube-api-access-7r5tz\") pod \"community-operators-pmd2p\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.401352 4708 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6h42j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.401413 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" podUID="4437a79c-97de-40ff-a2fa-d29cc2f86828" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial 
tcp 10.217.0.38:8080: connect: connection refused" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.404141 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qcjhm" podStartSLOduration=9.404125147 podStartE2EDuration="9.404125147s" podCreationTimestamp="2026-03-20 16:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.401988066 +0000 UTC m=+221.076324771" watchObservedRunningTime="2026-03-20 16:04:26.404125147 +0000 UTC m=+221.078461862" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.410895 4708 scope.go:117] "RemoveContainer" containerID="48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.413325 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-catalog-content\") pod \"certified-operators-5frmt\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.428391 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q884g\" (UniqueName: \"kubernetes.io/projected/cdc1f7ff-9725-4049-b06c-50a4adfa1696-kube-api-access-q884g\") pod \"certified-operators-5frmt\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.428493 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-utilities\") pod \"certified-operators-5frmt\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:04:26 
crc kubenswrapper[4708]: I0320 16:04:26.428580 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.413793 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-catalog-content\") pod \"certified-operators-5frmt\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 16:04:26.434309 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:26.934293141 +0000 UTC m=+221.608629846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.419046 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sr2kb" event={"ID":"6183b8c1-817c-4778-8310-b0481ebcc004","Type":"ContainerStarted","Data":"e0d68e73b171dc945575a370f2a04685851bc36fe6855049290c1fce0d1b1d39"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.434649 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-utilities\") pod \"certified-operators-5frmt\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 16:04:26.469855 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0\": container with ID starting with 48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0 not found: ID does not exist" containerID="48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.469912 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0"} err="failed to get container status \"48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0\": rpc error: code = 
NotFound desc = could not find container \"48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0\": container with ID starting with 48c7a447559f9aaf0ff1418a20d8070a4e06b396fc52dea644ebfc25b5b759b0 not found: ID does not exist" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.475112 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" event={"ID":"fa1343fa-debb-4834-a679-e82cec21dfda","Type":"ContainerStarted","Data":"92b8e2984cf3eb088645df30545985a14bc44876851797e72a721b0eab9ba956"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.479693 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p97v5" podStartSLOduration=9.479658677 podStartE2EDuration="9.479658677s" podCreationTimestamp="2026-03-20 16:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.473091072 +0000 UTC m=+221.147427787" watchObservedRunningTime="2026-03-20 16:04:26.479658677 +0000 UTC m=+221.153995382" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.493712 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.497243 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q884g\" (UniqueName: \"kubernetes.io/projected/cdc1f7ff-9725-4049-b06c-50a4adfa1696-kube-api-access-q884g\") pod \"certified-operators-5frmt\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.534645 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb"] Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.535399 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 16:04:26.536582 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:27.036566611 +0000 UTC m=+221.710903326 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.537944 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.558777 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" event={"ID":"c2c9b8a5-ad39-4793-b85e-6282872b25f6","Type":"ContainerStarted","Data":"5cbe93b697ff05f32a2af9ec74a18e00f92ceb14a7db827954e122c51ceae56d"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.564792 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" event={"ID":"c2c9b8a5-ad39-4793-b85e-6282872b25f6","Type":"ContainerStarted","Data":"3e6efe79fc78994eacc3539c2e58bfdf7586e5bad598e874dc7e588af5991aa6"} Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.589211 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8bbz9" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.613095 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pz6p4" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.632135 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" podStartSLOduration=165.63211483 
podStartE2EDuration="2m45.63211483s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.631226615 +0000 UTC m=+221.305563330" watchObservedRunningTime="2026-03-20 16:04:26.63211483 +0000 UTC m=+221.306451545" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.632337 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-46dcz" podStartSLOduration=166.632333377 podStartE2EDuration="2m46.632333377s" podCreationTimestamp="2026-03-20 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.589128432 +0000 UTC m=+221.263465147" watchObservedRunningTime="2026-03-20 16:04:26.632333377 +0000 UTC m=+221.306670082" Mar 20 16:04:26 crc kubenswrapper[4708]: W0320 16:04:26.636791 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1e7a051_112d_4679_821d_75c5bbe8b51f.slice/crio-b2790354ffa080fb818d2c7c21a241535103e5673e3659dd4ab1e0d01c6b14d7 WatchSource:0}: Error finding container b2790354ffa080fb818d2c7c21a241535103e5673e3659dd4ab1e0d01c6b14d7: Status 404 returned error can't find the container with id b2790354ffa080fb818d2c7c21a241535103e5673e3659dd4ab1e0d01c6b14d7 Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.638152 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 
16:04:26.648459 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:27.148441173 +0000 UTC m=+221.822777888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.666091 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.675142 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7"] Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.693375 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r28l7"] Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.707107 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" podStartSLOduration=165.707079195 podStartE2EDuration="2m45.707079195s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.703931716 +0000 UTC m=+221.378268431" watchObservedRunningTime="2026-03-20 16:04:26.707079195 +0000 UTC m=+221.381415900" Mar 20 16:04:26 crc 
kubenswrapper[4708]: I0320 16:04:26.715020 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.743511 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 16:04:26.744134 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:27.244111705 +0000 UTC m=+221.918448420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.800143 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43188: no serving certificate available for the kubelet" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.812612 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-4hmnl" podStartSLOduration=165.812595906 podStartE2EDuration="2m45.812595906s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.810784615 +0000 UTC m=+221.485121320" watchObservedRunningTime="2026-03-20 16:04:26.812595906 +0000 UTC m=+221.486932621" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.837070 4708 patch_prober.go:28] interesting pod/router-default-5444994796-8rllx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:04:26 crc kubenswrapper[4708]: [-]has-synced failed: reason withheld Mar 20 16:04:26 crc kubenswrapper[4708]: [+]process-running ok Mar 20 16:04:26 crc kubenswrapper[4708]: healthz check failed Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.837478 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8rllx" podUID="a22db040-c541-4ade-8099-899f3581d6c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.859998 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 16:04:26.860808 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:27.360792484 +0000 UTC m=+222.035129199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.867732 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jfntr" podStartSLOduration=165.86770646 podStartE2EDuration="2m45.86770646s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.844052819 +0000 UTC m=+221.518389534" watchObservedRunningTime="2026-03-20 16:04:26.86770646 +0000 UTC m=+221.542043175" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.885893 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vzncs"] Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.939196 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43194: no serving certificate available for the kubelet" Mar 20 16:04:26 crc kubenswrapper[4708]: I0320 16:04:26.964032 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:26 crc kubenswrapper[4708]: E0320 16:04:26.964468 4708 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:27.464454502 +0000 UTC m=+222.138791217 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:26 crc kubenswrapper[4708]: W0320 16:04:26.969175 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d8b3c28_42d0_479b_b45d_0fe00b8cb36d.slice/crio-4dde2d6ed803822ae4482f52062a9d09c8974be71c65867d65b589b43ea7e430 WatchSource:0}: Error finding container 4dde2d6ed803822ae4482f52062a9d09c8974be71c65867d65b589b43ea7e430: Status 404 returned error can't find the container with id 4dde2d6ed803822ae4482f52062a9d09c8974be71c65867d65b589b43ea7e430 Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.065844 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:27 crc kubenswrapper[4708]: E0320 16:04:27.066250 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:27.566235487 +0000 UTC m=+222.240572192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.168275 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:27 crc kubenswrapper[4708]: E0320 16:04:27.169285 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:27.669264419 +0000 UTC m=+222.343601134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.211768 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h4kcq"] Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.212786 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43210: no serving certificate available for the kubelet" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.270893 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:27 crc kubenswrapper[4708]: E0320 16:04:27.271456 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:27.771437145 +0000 UTC m=+222.445773860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.342023 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43222: no serving certificate available for the kubelet" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.354530 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5frmt"] Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.388627 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:27 crc kubenswrapper[4708]: E0320 16:04:27.389097 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:27.889073141 +0000 UTC m=+222.563409856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:27 crc kubenswrapper[4708]: W0320 16:04:27.395821 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdc1f7ff_9725_4049_b06c_50a4adfa1696.slice/crio-3dde5d1371fd794faf47f04e702da9c61ec0f78b103cc37bbeb2edadda21321b WatchSource:0}: Error finding container 3dde5d1371fd794faf47f04e702da9c61ec0f78b103cc37bbeb2edadda21321b: Status 404 returned error can't find the container with id 3dde5d1371fd794faf47f04e702da9c61ec0f78b103cc37bbeb2edadda21321b Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.489632 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43238: no serving certificate available for the kubelet" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.490605 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:27 crc kubenswrapper[4708]: E0320 16:04:27.491026 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:27.991003111 +0000 UTC m=+222.665339826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.550093 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pmd2p"] Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.592343 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:27 crc kubenswrapper[4708]: E0320 16:04:27.592818 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:28.092800997 +0000 UTC m=+222.767137712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.601396 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frmt" event={"ID":"cdc1f7ff-9725-4049-b06c-50a4adfa1696","Type":"ContainerStarted","Data":"3dde5d1371fd794faf47f04e702da9c61ec0f78b103cc37bbeb2edadda21321b"} Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.608798 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4kcq" event={"ID":"ca697a79-760d-4ae1-827f-bc2b0aee1785","Type":"ContainerStarted","Data":"1787afc0948237b24897fb97632cc654ac801b39ec4592412ad028488155510f"} Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.614032 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzncs" event={"ID":"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d","Type":"ContainerStarted","Data":"626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e"} Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.614076 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzncs" event={"ID":"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d","Type":"ContainerStarted","Data":"4dde2d6ed803822ae4482f52062a9d09c8974be71c65867d65b589b43ea7e430"} Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.642761 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hvrxr"] Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.644710 4708 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" event={"ID":"d1e7a051-112d-4679-821d-75c5bbe8b51f","Type":"ContainerStarted","Data":"53f3d89a423eb8e436cd659ad1e256eae8d088f075b78743161bd887ead38cf1"} Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.644743 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" event={"ID":"d1e7a051-112d-4679-821d-75c5bbe8b51f","Type":"ContainerStarted","Data":"b2790354ffa080fb818d2c7c21a241535103e5673e3659dd4ab1e0d01c6b14d7"} Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.644833 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.648471 4708 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6h42j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.648520 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" podUID="4437a79c-97de-40ff-a2fa-d29cc2f86828" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.648604 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.658561 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.662294 
4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.662611 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43252: no serving certificate available for the kubelet" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.681612 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hvrxr"] Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.694639 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.697838 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9mnp\" (UniqueName: \"kubernetes.io/projected/bece7d1b-b5d8-4762-b0b8-b2752c422776-kube-api-access-m9mnp\") pod \"redhat-marketplace-hvrxr\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.698211 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-utilities\") pod \"redhat-marketplace-hvrxr\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.698229 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-catalog-content\") pod \"redhat-marketplace-hvrxr\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:04:27 crc kubenswrapper[4708]: E0320 16:04:27.699516 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:28.199497842 +0000 UTC m=+222.873834547 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.723610 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" podStartSLOduration=6.7235857150000005 podStartE2EDuration="6.723585715s" podCreationTimestamp="2026-03-20 16:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:27.693902663 +0000 UTC m=+222.368239378" watchObservedRunningTime="2026-03-20 16:04:27.723585715 +0000 UTC m=+222.397922430" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.801871 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.802930 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9mnp\" (UniqueName: \"kubernetes.io/projected/bece7d1b-b5d8-4762-b0b8-b2752c422776-kube-api-access-m9mnp\") pod \"redhat-marketplace-hvrxr\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.803011 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-utilities\") pod \"redhat-marketplace-hvrxr\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.803031 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-catalog-content\") pod \"redhat-marketplace-hvrxr\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.803528 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-catalog-content\") pod \"redhat-marketplace-hvrxr\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:04:27 crc kubenswrapper[4708]: E0320 16:04:27.803958 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:28.303941882 +0000 UTC m=+222.978278597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.803998 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-utilities\") pod \"redhat-marketplace-hvrxr\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.829480 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9mnp\" (UniqueName: \"kubernetes.io/projected/bece7d1b-b5d8-4762-b0b8-b2752c422776-kube-api-access-m9mnp\") pod \"redhat-marketplace-hvrxr\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.839248 4708 patch_prober.go:28] interesting pod/router-default-5444994796-8rllx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:04:27 crc kubenswrapper[4708]: [-]has-synced failed: reason withheld Mar 20 16:04:27 crc kubenswrapper[4708]: [+]process-running ok Mar 20 16:04:27 crc kubenswrapper[4708]: healthz check failed Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.839589 4708 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-8rllx" podUID="a22db040-c541-4ade-8099-899f3581d6c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.894954 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43260: no serving certificate available for the kubelet" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.905529 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt"] Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.908602 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.910214 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:27 crc kubenswrapper[4708]: E0320 16:04:27.910749 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:28.410729611 +0000 UTC m=+223.085066326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.911010 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.911090 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.919155 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.920542 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.920849 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.921960 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.930815 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt"] Mar 20 16:04:27 crc kubenswrapper[4708]: I0320 16:04:27.991832 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.012347 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:28 crc kubenswrapper[4708]: E0320 16:04:28.012710 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:28.51268275 +0000 UTC m=+223.187019465 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.013529 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx8lr\" (UniqueName: \"kubernetes.io/projected/fcffd92a-6284-4e83-af6a-e49abbdd7387-kube-api-access-cx8lr\") pod \"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.013611 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fcffd92a-6284-4e83-af6a-e49abbdd7387-serving-cert\") pod \"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.013639 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.013723 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-client-ca\") pod \"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.013743 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-config\") pod \"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: E0320 16:04:28.014058 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:28.514045909 +0000 UTC m=+223.188382624 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.022612 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6675d"] Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.024394 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.039533 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6675d"] Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.088328 4708 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.114442 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.114713 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhc5p\" (UniqueName: \"kubernetes.io/projected/b7a193fd-f6c4-4779-ba32-d746f05094c1-kube-api-access-hhc5p\") pod \"redhat-marketplace-6675d\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " 
pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.114778 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcffd92a-6284-4e83-af6a-e49abbdd7387-serving-cert\") pod \"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.114804 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-catalog-content\") pod \"redhat-marketplace-6675d\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.114852 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-utilities\") pod \"redhat-marketplace-6675d\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.114896 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-client-ca\") pod \"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.114915 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-config\") pod 
\"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.114934 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8lr\" (UniqueName: \"kubernetes.io/projected/fcffd92a-6284-4e83-af6a-e49abbdd7387-kube-api-access-cx8lr\") pod \"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: E0320 16:04:28.115306 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 16:04:28.615289 +0000 UTC m=+223.289625715 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.117013 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-client-ca\") pod \"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.118427 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-config\") pod \"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.123021 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcffd92a-6284-4e83-af6a-e49abbdd7387-serving-cert\") pod \"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.132488 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0619cccd-a201-4a73-9d07-01ccc4eb7c84" path="/var/lib/kubelet/pods/0619cccd-a201-4a73-9d07-01ccc4eb7c84/volumes" Mar 20 16:04:28 crc 
kubenswrapper[4708]: I0320 16:04:28.133024 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40d77bdf-8222-4072-bd4b-b766e73992cc" path="/var/lib/kubelet/pods/40d77bdf-8222-4072-bd4b-b766e73992cc/volumes" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.141802 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx8lr\" (UniqueName: \"kubernetes.io/projected/fcffd92a-6284-4e83-af6a-e49abbdd7387-kube-api-access-cx8lr\") pod \"route-controller-manager-7bd4f69545-56vpt\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.217243 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhc5p\" (UniqueName: \"kubernetes.io/projected/b7a193fd-f6c4-4779-ba32-d746f05094c1-kube-api-access-hhc5p\") pod \"redhat-marketplace-6675d\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.219713 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-catalog-content\") pod \"redhat-marketplace-6675d\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.219762 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.219815 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-utilities\") pod \"redhat-marketplace-6675d\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.220555 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-catalog-content\") pod \"redhat-marketplace-6675d\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.221031 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-utilities\") pod \"redhat-marketplace-6675d\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:04:28 crc kubenswrapper[4708]: E0320 16:04:28.221811 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 16:04:28.72179888 +0000 UTC m=+223.396135595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.256269 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhc5p\" (UniqueName: \"kubernetes.io/projected/b7a193fd-f6c4-4779-ba32-d746f05094c1-kube-api-access-hhc5p\") pod \"redhat-marketplace-6675d\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.273354 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43262: no serving certificate available for the kubelet" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.290937 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.321883 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:28 crc kubenswrapper[4708]: E0320 16:04:28.322600 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:28.822584457 +0000 UTC m=+223.496921172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.359847 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hvrxr"] Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.366710 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.391544 4708 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T16:04:28.088353316Z","Handler":null,"Name":""} Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.424405 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:28 crc kubenswrapper[4708]: E0320 16:04:28.424813 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 16:04:28.924800935 +0000 UTC m=+223.599137640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tkdfr" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.428765 4708 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.428804 4708 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.525400 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.530090 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.628083 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.630704 4708 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.630727 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.670292 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w48dx" event={"ID":"05d36f6d-3186-48d6-8388-b4c1b6e02bd2","Type":"ContainerStarted","Data":"09fcbc15fe6ca0f83924fe198f618ffa9591ed6ea53f0729e23e7d3b914f1156"} Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.670340 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w48dx" event={"ID":"05d36f6d-3186-48d6-8388-b4c1b6e02bd2","Type":"ContainerStarted","Data":"5e8d58db0c6649566c1a4ce8d94013f430125cb1fde3a58d95157c0c7cba645a"} Mar 20 16:04:28 crc 
kubenswrapper[4708]: I0320 16:04:28.674505 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvrxr" event={"ID":"bece7d1b-b5d8-4762-b0b8-b2752c422776","Type":"ContainerStarted","Data":"04c15eb141f571ab55102f7edba5231c4e5bc5373fc6b2007d86d88066098acb"} Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.679796 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tkdfr\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") " pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.681375 4708 generic.go:334] "Generic (PLEG): container finished" podID="ca697a79-760d-4ae1-827f-bc2b0aee1785" containerID="60a31673c741948f3ed4bfb1fc52b9f3930b2cf6b42784e5a9ea24652e9f351b" exitCode=0 Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.681418 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4kcq" event={"ID":"ca697a79-760d-4ae1-827f-bc2b0aee1785","Type":"ContainerDied","Data":"60a31673c741948f3ed4bfb1fc52b9f3930b2cf6b42784e5a9ea24652e9f351b"} Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.687693 4708 generic.go:334] "Generic (PLEG): container finished" podID="b626637b-4d54-4743-9717-142fe62e392b" containerID="02511d17c83c93db7f2fc5999ae295f0d820ac98422957ce98283936f48d2790" exitCode=0 Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.687766 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmd2p" event={"ID":"b626637b-4d54-4743-9717-142fe62e392b","Type":"ContainerDied","Data":"02511d17c83c93db7f2fc5999ae295f0d820ac98422957ce98283936f48d2790"} Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.687797 4708 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-pmd2p" event={"ID":"b626637b-4d54-4743-9717-142fe62e392b","Type":"ContainerStarted","Data":"200213f14b30740421ecd60e2c09f628bb75ea8cc90b5ef3dc9de6094b7cf587"} Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.699043 4708 generic.go:334] "Generic (PLEG): container finished" podID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" containerID="626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e" exitCode=0 Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.699111 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzncs" event={"ID":"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d","Type":"ContainerDied","Data":"626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e"} Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.710210 4708 generic.go:334] "Generic (PLEG): container finished" podID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" containerID="94e31968168b89b150e227267b7a7d71fc47007422c9b1e242d8e8a44f723923" exitCode=0 Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.710423 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frmt" event={"ID":"cdc1f7ff-9725-4049-b06c-50a4adfa1696","Type":"ContainerDied","Data":"94e31968168b89b150e227267b7a7d71fc47007422c9b1e242d8e8a44f723923"} Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.717304 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.717982 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6675d"] Mar 20 16:04:28 crc kubenswrapper[4708]: W0320 16:04:28.758445 4708 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7a193fd_f6c4_4779_ba32_d746f05094c1.slice/crio-ce1a7b356f057c3847de69a10db943fdeedf69ff04ff8f0f11676509a6b08dd8 WatchSource:0}: Error finding container ce1a7b356f057c3847de69a10db943fdeedf69ff04ff8f0f11676509a6b08dd8: Status 404 returned error can't find the container with id ce1a7b356f057c3847de69a10db943fdeedf69ff04ff8f0f11676509a6b08dd8 Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.841276 4708 patch_prober.go:28] interesting pod/router-default-5444994796-8rllx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:04:28 crc kubenswrapper[4708]: [-]has-synced failed: reason withheld Mar 20 16:04:28 crc kubenswrapper[4708]: [+]process-running ok Mar 20 16:04:28 crc kubenswrapper[4708]: healthz check failed Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.841364 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8rllx" podUID="a22db040-c541-4ade-8099-899f3581d6c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.911729 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:28 crc kubenswrapper[4708]: I0320 16:04:28.939408 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43278: no serving certificate available for the kubelet" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.033042 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dk7n9"] Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.037817 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.055430 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.067342 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dk7n9"] Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.087118 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt"] Mar 20 16:04:29 crc kubenswrapper[4708]: W0320 16:04:29.122695 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcffd92a_6284_4e83_af6a_e49abbdd7387.slice/crio-9a56bbe2bd047aa5da74bc46e371cb2e8bc7b78ccd242a555094ef43033354e4 WatchSource:0}: Error finding container 9a56bbe2bd047aa5da74bc46e371cb2e8bc7b78ccd242a555094ef43033354e4: Status 404 returned error can't find the container with id 9a56bbe2bd047aa5da74bc46e371cb2e8bc7b78ccd242a555094ef43033354e4 Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.143091 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwtbf\" (UniqueName: \"kubernetes.io/projected/65dd12f7-e1af-4213-b264-d846c01eaba8-kube-api-access-bwtbf\") pod \"redhat-operators-dk7n9\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.143195 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-utilities\") pod \"redhat-operators-dk7n9\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " pod="openshift-marketplace/redhat-operators-dk7n9" Mar 
20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.143245 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-catalog-content\") pod \"redhat-operators-dk7n9\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.211112 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tkdfr"] Mar 20 16:04:29 crc kubenswrapper[4708]: W0320 16:04:29.219944 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod761e5144_aa8a_4203_b166_b5dc638bfe79.slice/crio-d32517300d2fd960512c82f32c7b881581115340f83df58b03321fdfca63a33e WatchSource:0}: Error finding container d32517300d2fd960512c82f32c7b881581115340f83df58b03321fdfca63a33e: Status 404 returned error can't find the container with id d32517300d2fd960512c82f32c7b881581115340f83df58b03321fdfca63a33e Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.246421 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-utilities\") pod \"redhat-operators-dk7n9\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.246839 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-catalog-content\") pod \"redhat-operators-dk7n9\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.246886 4708 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bwtbf\" (UniqueName: \"kubernetes.io/projected/65dd12f7-e1af-4213-b264-d846c01eaba8-kube-api-access-bwtbf\") pod \"redhat-operators-dk7n9\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.247604 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-utilities\") pod \"redhat-operators-dk7n9\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.247841 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-catalog-content\") pod \"redhat-operators-dk7n9\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.271428 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwtbf\" (UniqueName: \"kubernetes.io/projected/65dd12f7-e1af-4213-b264-d846c01eaba8-kube-api-access-bwtbf\") pod \"redhat-operators-dk7n9\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.382837 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.444009 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mr9ct"] Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.445809 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.446534 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mr9ct"] Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.552968 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-utilities\") pod \"redhat-operators-mr9ct\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.553052 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-catalog-content\") pod \"redhat-operators-mr9ct\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.553136 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvqzm\" (UniqueName: \"kubernetes.io/projected/18d7f096-fc87-4f68-959e-5ab803b7e097-kube-api-access-xvqzm\") pod \"redhat-operators-mr9ct\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.654512 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-catalog-content\") pod \"redhat-operators-mr9ct\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.654607 4708 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-xvqzm\" (UniqueName: \"kubernetes.io/projected/18d7f096-fc87-4f68-959e-5ab803b7e097-kube-api-access-xvqzm\") pod \"redhat-operators-mr9ct\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.654655 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-utilities\") pod \"redhat-operators-mr9ct\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.655783 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-utilities\") pod \"redhat-operators-mr9ct\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.656014 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-catalog-content\") pod \"redhat-operators-mr9ct\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.687929 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvqzm\" (UniqueName: \"kubernetes.io/projected/18d7f096-fc87-4f68-959e-5ab803b7e097-kube-api-access-xvqzm\") pod \"redhat-operators-mr9ct\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.758172 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w48dx" 
event={"ID":"05d36f6d-3186-48d6-8388-b4c1b6e02bd2","Type":"ContainerStarted","Data":"8347478db8bfbf5208361bd4944329dc48fb094f18fd861cc5d98ac91047c433"} Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.771449 4708 generic.go:334] "Generic (PLEG): container finished" podID="1acefb60-c1cd-4b60-8a16-c99c44f4d8c7" containerID="74a8b9763e8602e2b20535d9ea99cfba067cce4383d20eeda50566ccf5265b27" exitCode=0 Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.771503 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" event={"ID":"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7","Type":"ContainerDied","Data":"74a8b9763e8602e2b20535d9ea99cfba067cce4383d20eeda50566ccf5265b27"} Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.778007 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.781841 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" event={"ID":"fcffd92a-6284-4e83-af6a-e49abbdd7387","Type":"ContainerStarted","Data":"10fe4a6a817002218175e35983cbfa45d08229ac9b4ac86fa53d04ea5536859c"} Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.781874 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" event={"ID":"fcffd92a-6284-4e83-af6a-e49abbdd7387","Type":"ContainerStarted","Data":"9a56bbe2bd047aa5da74bc46e371cb2e8bc7b78ccd242a555094ef43033354e4"} Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.782831 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.798373 4708 generic.go:334] "Generic (PLEG): container finished" 
podID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerID="04d6f12446a266105a35664aef9b5a145b0216aa077bdd34a3b73490ec0e4c6c" exitCode=0 Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.798447 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvrxr" event={"ID":"bece7d1b-b5d8-4762-b0b8-b2752c422776","Type":"ContainerDied","Data":"04d6f12446a266105a35664aef9b5a145b0216aa077bdd34a3b73490ec0e4c6c"} Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.813102 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-w48dx" podStartSLOduration=12.813083444 podStartE2EDuration="12.813083444s" podCreationTimestamp="2026-03-20 16:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:29.81012986 +0000 UTC m=+224.484466575" watchObservedRunningTime="2026-03-20 16:04:29.813083444 +0000 UTC m=+224.487420159" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.834559 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.834842 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.835973 4708 patch_prober.go:28] interesting pod/console-f9d7485db-kdrms container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.836019 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kdrms" podUID="77884518-e4d9-4a61-b8fb-55b1e2f9e23a" containerName="console" probeResult="failure" output="Get 
\"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.837022 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" event={"ID":"761e5144-aa8a-4203-b166-b5dc638bfe79","Type":"ContainerStarted","Data":"9f4f407d45537d007361fa80c22a4ea8aca3635eda6b8969ca358f4415d40165"} Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.837059 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" event={"ID":"761e5144-aa8a-4203-b166-b5dc638bfe79","Type":"ContainerStarted","Data":"d32517300d2fd960512c82f32c7b881581115340f83df58b03321fdfca63a33e"} Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.837840 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.846854 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.853323 4708 patch_prober.go:28] interesting pod/router-default-5444994796-8rllx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:04:29 crc kubenswrapper[4708]: [-]has-synced failed: reason withheld Mar 20 16:04:29 crc kubenswrapper[4708]: [+]process-running ok Mar 20 16:04:29 crc kubenswrapper[4708]: healthz check failed Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.853376 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8rllx" podUID="a22db040-c541-4ade-8099-899f3581d6c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.885385 4708 generic.go:334] "Generic (PLEG): container finished" podID="b7a193fd-f6c4-4779-ba32-d746f05094c1" containerID="675abf9754bd968001e9c24811fd424b4d02b0f8c0364bbf8169646da84a493d" exitCode=0 Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.886546 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6675d" event={"ID":"b7a193fd-f6c4-4779-ba32-d746f05094c1","Type":"ContainerDied","Data":"675abf9754bd968001e9c24811fd424b4d02b0f8c0364bbf8169646da84a493d"} Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.886652 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6675d" event={"ID":"b7a193fd-f6c4-4779-ba32-d746f05094c1","Type":"ContainerStarted","Data":"ce1a7b356f057c3847de69a10db943fdeedf69ff04ff8f0f11676509a6b08dd8"} Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.905026 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" podStartSLOduration=8.90500887 podStartE2EDuration="8.90500887s" podCreationTimestamp="2026-03-20 16:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:29.894950595 +0000 UTC m=+224.569287310" watchObservedRunningTime="2026-03-20 16:04:29.90500887 +0000 UTC m=+224.579345585" Mar 20 16:04:29 crc kubenswrapper[4708]: I0320 16:04:29.909571 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dk7n9"] Mar 20 16:04:29 crc kubenswrapper[4708]: W0320 16:04:29.945159 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65dd12f7_e1af_4213_b264_d846c01eaba8.slice/crio-1682a388b5edf6a59245c3bc7e9650fa6a473326b886393cbe58a7a7325393a9 
WatchSource:0}: Error finding container 1682a388b5edf6a59245c3bc7e9650fa6a473326b886393cbe58a7a7325393a9: Status 404 returned error can't find the container with id 1682a388b5edf6a59245c3bc7e9650fa6a473326b886393cbe58a7a7325393a9 Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.008523 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" podStartSLOduration=169.008506965 podStartE2EDuration="2m49.008506965s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:30.007770003 +0000 UTC m=+224.682106718" watchObservedRunningTime="2026-03-20 16:04:30.008506965 +0000 UTC m=+224.682843670" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.052408 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.053093 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.056396 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.056654 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.062165 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.092345 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.092422 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.101278 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.116831 4708 patch_prober.go:28] interesting pod/downloads-7954f5f757-2bgzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.116849 4708 patch_prober.go:28] interesting pod/downloads-7954f5f757-2bgzp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.116890 4708 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-2bgzp" podUID="0a5fa365-0110-4a06-b2d7-cfd9b5745603" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.116881 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2bgzp" podUID="0a5fa365-0110-4a06-b2d7-cfd9b5745603" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.163886 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.182874 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7b34f95-136c-4980-ae84-8a47f06083a9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b7b34f95-136c-4980-ae84-8a47f06083a9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.182977 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7b34f95-136c-4980-ae84-8a47f06083a9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b7b34f95-136c-4980-ae84-8a47f06083a9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.196274 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mr9ct"] Mar 20 16:04:30 crc kubenswrapper[4708]: W0320 16:04:30.219356 4708 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18d7f096_fc87_4f68_959e_5ab803b7e097.slice/crio-4bdeba6928915fff4946c25abc323c91a1898e0a01cef506e43aeebf92843cc0 WatchSource:0}: Error finding container 4bdeba6928915fff4946c25abc323c91a1898e0a01cef506e43aeebf92843cc0: Status 404 returned error can't find the container with id 4bdeba6928915fff4946c25abc323c91a1898e0a01cef506e43aeebf92843cc0 Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.262629 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43284: no serving certificate available for the kubelet" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.284299 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7b34f95-136c-4980-ae84-8a47f06083a9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b7b34f95-136c-4980-ae84-8a47f06083a9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.284884 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7b34f95-136c-4980-ae84-8a47f06083a9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b7b34f95-136c-4980-ae84-8a47f06083a9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.284979 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7b34f95-136c-4980-ae84-8a47f06083a9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b7b34f95-136c-4980-ae84-8a47f06083a9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.309838 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b7b34f95-136c-4980-ae84-8a47f06083a9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b7b34f95-136c-4980-ae84-8a47f06083a9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.378480 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.438531 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.439265 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.443117 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.444387 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.447921 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.488023 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1b7e911-a613-4360-87d3-cbcedcd1283e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b1b7e911-a613-4360-87d3-cbcedcd1283e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.488098 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b1b7e911-a613-4360-87d3-cbcedcd1283e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b1b7e911-a613-4360-87d3-cbcedcd1283e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.589644 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1b7e911-a613-4360-87d3-cbcedcd1283e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b1b7e911-a613-4360-87d3-cbcedcd1283e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.589719 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1b7e911-a613-4360-87d3-cbcedcd1283e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b1b7e911-a613-4360-87d3-cbcedcd1283e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.589756 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1b7e911-a613-4360-87d3-cbcedcd1283e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b1b7e911-a613-4360-87d3-cbcedcd1283e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.613018 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1b7e911-a613-4360-87d3-cbcedcd1283e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b1b7e911-a613-4360-87d3-cbcedcd1283e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.777936 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.832203 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.842534 4708 patch_prober.go:28] interesting pod/router-default-5444994796-8rllx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:04:30 crc kubenswrapper[4708]: [-]has-synced failed: reason withheld Mar 20 16:04:30 crc kubenswrapper[4708]: [+]process-running ok Mar 20 16:04:30 crc kubenswrapper[4708]: healthz check failed Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.842604 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8rllx" podUID="a22db040-c541-4ade-8099-899f3581d6c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.855723 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 16:04:30 crc kubenswrapper[4708]: W0320 16:04:30.866503 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb7b34f95_136c_4980_ae84_8a47f06083a9.slice/crio-7836a764f7330e2ac6a66e3b8d626eee359098271e372d8e4f6aefa8dd9edda1 WatchSource:0}: Error finding container 7836a764f7330e2ac6a66e3b8d626eee359098271e372d8e4f6aefa8dd9edda1: Status 404 returned error can't find the container with id 7836a764f7330e2ac6a66e3b8d626eee359098271e372d8e4f6aefa8dd9edda1 Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.897053 4708 generic.go:334] "Generic (PLEG): container finished" podID="65dd12f7-e1af-4213-b264-d846c01eaba8" 
containerID="117d1ee6ba475b4ba00e4179f031bc9d85e357e32c3f526dd7b4f838a090027a" exitCode=0 Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.897135 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk7n9" event={"ID":"65dd12f7-e1af-4213-b264-d846c01eaba8","Type":"ContainerDied","Data":"117d1ee6ba475b4ba00e4179f031bc9d85e357e32c3f526dd7b4f838a090027a"} Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.897165 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk7n9" event={"ID":"65dd12f7-e1af-4213-b264-d846c01eaba8","Type":"ContainerStarted","Data":"1682a388b5edf6a59245c3bc7e9650fa6a473326b886393cbe58a7a7325393a9"} Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.900104 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b7b34f95-136c-4980-ae84-8a47f06083a9","Type":"ContainerStarted","Data":"7836a764f7330e2ac6a66e3b8d626eee359098271e372d8e4f6aefa8dd9edda1"} Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.904393 4708 generic.go:334] "Generic (PLEG): container finished" podID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerID="2faf8b9350c6b87e2af4b910a38a655377f744285ddc5e834e918f65f0bdff72" exitCode=0 Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.905392 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr9ct" event={"ID":"18d7f096-fc87-4f68-959e-5ab803b7e097","Type":"ContainerDied","Data":"2faf8b9350c6b87e2af4b910a38a655377f744285ddc5e834e918f65f0bdff72"} Mar 20 16:04:30 crc kubenswrapper[4708]: I0320 16:04:30.905426 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr9ct" event={"ID":"18d7f096-fc87-4f68-959e-5ab803b7e097","Type":"ContainerStarted","Data":"4bdeba6928915fff4946c25abc323c91a1898e0a01cef506e43aeebf92843cc0"} Mar 20 16:04:30 crc kubenswrapper[4708]: 
I0320 16:04:30.912706 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-rz6l8" Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.148014 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 16:04:31 crc kubenswrapper[4708]: W0320 16:04:31.197269 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb1b7e911_a613_4360_87d3_cbcedcd1283e.slice/crio-4ef04555a4b7a0a058eab7b36bce9ec4afd9e5a928443c4c44714564f7365468 WatchSource:0}: Error finding container 4ef04555a4b7a0a058eab7b36bce9ec4afd9e5a928443c4c44714564f7365468: Status 404 returned error can't find the container with id 4ef04555a4b7a0a058eab7b36bce9ec4afd9e5a928443c4c44714564f7365468 Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.458067 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.607071 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-secret-volume\") pod \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.607180 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-config-volume\") pod \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.607279 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sstrq\" (UniqueName: 
\"kubernetes.io/projected/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-kube-api-access-sstrq\") pod \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\" (UID: \"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7\") " Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.610439 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-config-volume" (OuterVolumeSpecName: "config-volume") pod "1acefb60-c1cd-4b60-8a16-c99c44f4d8c7" (UID: "1acefb60-c1cd-4b60-8a16-c99c44f4d8c7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.618766 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1acefb60-c1cd-4b60-8a16-c99c44f4d8c7" (UID: "1acefb60-c1cd-4b60-8a16-c99c44f4d8c7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.618859 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-kube-api-access-sstrq" (OuterVolumeSpecName: "kube-api-access-sstrq") pod "1acefb60-c1cd-4b60-8a16-c99c44f4d8c7" (UID: "1acefb60-c1cd-4b60-8a16-c99c44f4d8c7"). InnerVolumeSpecName "kube-api-access-sstrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.708823 4708 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.708856 4708 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.708866 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sstrq\" (UniqueName: \"kubernetes.io/projected/1acefb60-c1cd-4b60-8a16-c99c44f4d8c7-kube-api-access-sstrq\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.836409 4708 patch_prober.go:28] interesting pod/router-default-5444994796-8rllx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:04:31 crc kubenswrapper[4708]: [-]has-synced failed: reason withheld Mar 20 16:04:31 crc kubenswrapper[4708]: [+]process-running ok Mar 20 16:04:31 crc kubenswrapper[4708]: healthz check failed Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.836492 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8rllx" podUID="a22db040-c541-4ade-8099-899f3581d6c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.921047 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"b1b7e911-a613-4360-87d3-cbcedcd1283e","Type":"ContainerStarted","Data":"4ef04555a4b7a0a058eab7b36bce9ec4afd9e5a928443c4c44714564f7365468"} Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.929388 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" event={"ID":"1acefb60-c1cd-4b60-8a16-c99c44f4d8c7","Type":"ContainerDied","Data":"7b781e8ab5403143556a21a3b8fa3a3c45da5a63cd10c709deba74e38a90e3dd"} Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.929450 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b781e8ab5403143556a21a3b8fa3a3c45da5a63cd10c709deba74e38a90e3dd" Mar 20 16:04:31 crc kubenswrapper[4708]: I0320 16:04:31.929687 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-v2h7s" Mar 20 16:04:32 crc kubenswrapper[4708]: I0320 16:04:32.834756 4708 patch_prober.go:28] interesting pod/router-default-5444994796-8rllx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:04:32 crc kubenswrapper[4708]: [-]has-synced failed: reason withheld Mar 20 16:04:32 crc kubenswrapper[4708]: [+]process-running ok Mar 20 16:04:32 crc kubenswrapper[4708]: healthz check failed Mar 20 16:04:32 crc kubenswrapper[4708]: I0320 16:04:32.835389 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8rllx" podUID="a22db040-c541-4ade-8099-899f3581d6c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:04:32 crc kubenswrapper[4708]: I0320 16:04:32.853319 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43298: no serving certificate available for the kubelet" Mar 20 16:04:32 crc kubenswrapper[4708]: I0320 
16:04:32.944714 4708 generic.go:334] "Generic (PLEG): container finished" podID="b7b34f95-136c-4980-ae84-8a47f06083a9" containerID="edeade65d8a0a1d17f638aeb6bbb5934e13dbca74ecad9644d31390c4da29078" exitCode=0 Mar 20 16:04:32 crc kubenswrapper[4708]: I0320 16:04:32.944799 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b7b34f95-136c-4980-ae84-8a47f06083a9","Type":"ContainerDied","Data":"edeade65d8a0a1d17f638aeb6bbb5934e13dbca74ecad9644d31390c4da29078"} Mar 20 16:04:32 crc kubenswrapper[4708]: I0320 16:04:32.953855 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b1b7e911-a613-4360-87d3-cbcedcd1283e","Type":"ContainerStarted","Data":"da6b6ced506ee6ad240b9d4ccd63566cbc0a6a15c70458756148414055be72fe"} Mar 20 16:04:32 crc kubenswrapper[4708]: I0320 16:04:32.982689 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.982649575 podStartE2EDuration="2.982649575s" podCreationTimestamp="2026-03-20 16:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:32.981987937 +0000 UTC m=+227.656324672" watchObservedRunningTime="2026-03-20 16:04:32.982649575 +0000 UTC m=+227.656986290" Mar 20 16:04:33 crc kubenswrapper[4708]: I0320 16:04:33.298704 4708 ???:1] "http: TLS handshake error from 192.168.126.11:43302: no serving certificate available for the kubelet" Mar 20 16:04:33 crc kubenswrapper[4708]: I0320 16:04:33.836251 4708 patch_prober.go:28] interesting pod/router-default-5444994796-8rllx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 16:04:33 crc kubenswrapper[4708]: [-]has-synced failed: reason 
withheld Mar 20 16:04:33 crc kubenswrapper[4708]: [+]process-running ok Mar 20 16:04:33 crc kubenswrapper[4708]: healthz check failed Mar 20 16:04:33 crc kubenswrapper[4708]: I0320 16:04:33.836336 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8rllx" podUID="a22db040-c541-4ade-8099-899f3581d6c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 16:04:33 crc kubenswrapper[4708]: I0320 16:04:33.970557 4708 generic.go:334] "Generic (PLEG): container finished" podID="b1b7e911-a613-4360-87d3-cbcedcd1283e" containerID="da6b6ced506ee6ad240b9d4ccd63566cbc0a6a15c70458756148414055be72fe" exitCode=0 Mar 20 16:04:33 crc kubenswrapper[4708]: I0320 16:04:33.971305 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b1b7e911-a613-4360-87d3-cbcedcd1283e","Type":"ContainerDied","Data":"da6b6ced506ee6ad240b9d4ccd63566cbc0a6a15c70458756148414055be72fe"} Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.512851 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.636510 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7b34f95-136c-4980-ae84-8a47f06083a9-kube-api-access\") pod \"b7b34f95-136c-4980-ae84-8a47f06083a9\" (UID: \"b7b34f95-136c-4980-ae84-8a47f06083a9\") " Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.636549 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7b34f95-136c-4980-ae84-8a47f06083a9-kubelet-dir\") pod \"b7b34f95-136c-4980-ae84-8a47f06083a9\" (UID: \"b7b34f95-136c-4980-ae84-8a47f06083a9\") " Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.643423 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b34f95-136c-4980-ae84-8a47f06083a9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b7b34f95-136c-4980-ae84-8a47f06083a9" (UID: "b7b34f95-136c-4980-ae84-8a47f06083a9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.648859 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b34f95-136c-4980-ae84-8a47f06083a9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b7b34f95-136c-4980-ae84-8a47f06083a9" (UID: "b7b34f95-136c-4980-ae84-8a47f06083a9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.742324 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7b34f95-136c-4980-ae84-8a47f06083a9-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.742363 4708 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7b34f95-136c-4980-ae84-8a47f06083a9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.856259 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.861958 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8rllx" Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.979212 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b7b34f95-136c-4980-ae84-8a47f06083a9","Type":"ContainerDied","Data":"7836a764f7330e2ac6a66e3b8d626eee359098271e372d8e4f6aefa8dd9edda1"} Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.979254 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7836a764f7330e2ac6a66e3b8d626eee359098271e372d8e4f6aefa8dd9edda1" Mar 20 16:04:34 crc kubenswrapper[4708]: I0320 16:04:34.979289 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 16:04:35 crc kubenswrapper[4708]: I0320 16:04:35.353887 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:04:35 crc kubenswrapper[4708]: I0320 16:04:35.459959 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1b7e911-a613-4360-87d3-cbcedcd1283e-kubelet-dir\") pod \"b1b7e911-a613-4360-87d3-cbcedcd1283e\" (UID: \"b1b7e911-a613-4360-87d3-cbcedcd1283e\") " Mar 20 16:04:35 crc kubenswrapper[4708]: I0320 16:04:35.460031 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1b7e911-a613-4360-87d3-cbcedcd1283e-kube-api-access\") pod \"b1b7e911-a613-4360-87d3-cbcedcd1283e\" (UID: \"b1b7e911-a613-4360-87d3-cbcedcd1283e\") " Mar 20 16:04:35 crc kubenswrapper[4708]: I0320 16:04:35.460395 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1b7e911-a613-4360-87d3-cbcedcd1283e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b1b7e911-a613-4360-87d3-cbcedcd1283e" (UID: "b1b7e911-a613-4360-87d3-cbcedcd1283e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:04:35 crc kubenswrapper[4708]: I0320 16:04:35.483359 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b7e911-a613-4360-87d3-cbcedcd1283e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b1b7e911-a613-4360-87d3-cbcedcd1283e" (UID: "b1b7e911-a613-4360-87d3-cbcedcd1283e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:04:35 crc kubenswrapper[4708]: I0320 16:04:35.561762 4708 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1b7e911-a613-4360-87d3-cbcedcd1283e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:35 crc kubenswrapper[4708]: I0320 16:04:35.561800 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1b7e911-a613-4360-87d3-cbcedcd1283e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:35 crc kubenswrapper[4708]: I0320 16:04:35.950414 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qcjhm" Mar 20 16:04:35 crc kubenswrapper[4708]: I0320 16:04:35.987944 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b1b7e911-a613-4360-87d3-cbcedcd1283e","Type":"ContainerDied","Data":"4ef04555a4b7a0a058eab7b36bce9ec4afd9e5a928443c4c44714564f7365468"} Mar 20 16:04:35 crc kubenswrapper[4708]: I0320 16:04:35.987999 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ef04555a4b7a0a058eab7b36bce9ec4afd9e5a928443c4c44714564f7365468" Mar 20 16:04:35 crc kubenswrapper[4708]: I0320 16:04:35.989388 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 16:04:38 crc kubenswrapper[4708]: I0320 16:04:38.012119 4708 ???:1] "http: TLS handshake error from 192.168.126.11:40194: no serving certificate available for the kubelet" Mar 20 16:04:39 crc kubenswrapper[4708]: I0320 16:04:39.857259 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:39 crc kubenswrapper[4708]: I0320 16:04:39.861398 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:04:40 crc kubenswrapper[4708]: I0320 16:04:40.123154 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2bgzp" Mar 20 16:04:41 crc kubenswrapper[4708]: I0320 16:04:41.487803 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb"] Mar 20 16:04:41 crc kubenswrapper[4708]: I0320 16:04:41.488018 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" podUID="d1e7a051-112d-4679-821d-75c5bbe8b51f" containerName="controller-manager" containerID="cri-o://53f3d89a423eb8e436cd659ad1e256eae8d088f075b78743161bd887ead38cf1" gracePeriod=30 Mar 20 16:04:41 crc kubenswrapper[4708]: I0320 16:04:41.495270 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt"] Mar 20 16:04:41 crc kubenswrapper[4708]: I0320 16:04:41.503201 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" podUID="fcffd92a-6284-4e83-af6a-e49abbdd7387" containerName="route-controller-manager" containerID="cri-o://10fe4a6a817002218175e35983cbfa45d08229ac9b4ac86fa53d04ea5536859c" 
gracePeriod=30 Mar 20 16:04:42 crc kubenswrapper[4708]: I0320 16:04:42.065493 4708 generic.go:334] "Generic (PLEG): container finished" podID="fcffd92a-6284-4e83-af6a-e49abbdd7387" containerID="10fe4a6a817002218175e35983cbfa45d08229ac9b4ac86fa53d04ea5536859c" exitCode=0 Mar 20 16:04:42 crc kubenswrapper[4708]: I0320 16:04:42.065576 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" event={"ID":"fcffd92a-6284-4e83-af6a-e49abbdd7387","Type":"ContainerDied","Data":"10fe4a6a817002218175e35983cbfa45d08229ac9b4ac86fa53d04ea5536859c"} Mar 20 16:04:42 crc kubenswrapper[4708]: I0320 16:04:42.067757 4708 generic.go:334] "Generic (PLEG): container finished" podID="d1e7a051-112d-4679-821d-75c5bbe8b51f" containerID="53f3d89a423eb8e436cd659ad1e256eae8d088f075b78743161bd887ead38cf1" exitCode=0 Mar 20 16:04:42 crc kubenswrapper[4708]: I0320 16:04:42.067779 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" event={"ID":"d1e7a051-112d-4679-821d-75c5bbe8b51f","Type":"ContainerDied","Data":"53f3d89a423eb8e436cd659ad1e256eae8d088f075b78743161bd887ead38cf1"} Mar 20 16:04:46 crc kubenswrapper[4708]: I0320 16:04:46.749714 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:46 crc kubenswrapper[4708]: I0320 16:04:46.753851 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 16:04:46 crc kubenswrapper[4708]: I0320 16:04:46.765377 4708 patch_prober.go:28] interesting pod/controller-manager-5d7fb5f6bc-bqtpb container/controller-manager namespace/openshift-controller-manager: Readiness 
probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:04:46 crc kubenswrapper[4708]: I0320 16:04:46.765731 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" podUID="d1e7a051-112d-4679-821d-75c5bbe8b51f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:04:46 crc kubenswrapper[4708]: I0320 16:04:46.781242 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3574461f-8c2b-446b-a2f1-c1be3a8d7824-metrics-certs\") pod \"network-metrics-daemon-gtlzm\" (UID: \"3574461f-8c2b-446b-a2f1-c1be3a8d7824\") " pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:46 crc kubenswrapper[4708]: I0320 16:04:46.830907 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 16:04:46 crc kubenswrapper[4708]: I0320 16:04:46.837988 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gtlzm" Mar 20 16:04:47 crc kubenswrapper[4708]: E0320 16:04:47.501421 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 16:04:47 crc kubenswrapper[4708]: E0320 16:04:47.501621 4708 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:04:47 crc kubenswrapper[4708]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 16:04:47 crc kubenswrapper[4708]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9hx92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29567044-8vwl5_openshift-infra(69fd6ec4-db77-4309-977a-7e80359aec50): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 16:04:47 crc kubenswrapper[4708]: > logger="UnhandledError" Mar 20 16:04:47 crc kubenswrapper[4708]: E0320 16:04:47.505531 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567044-8vwl5" podUID="69fd6ec4-db77-4309-977a-7e80359aec50" Mar 20 16:04:48 crc kubenswrapper[4708]: E0320 16:04:48.102492 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29567044-8vwl5" podUID="69fd6ec4-db77-4309-977a-7e80359aec50" Mar 20 16:04:48 crc kubenswrapper[4708]: I0320 16:04:48.922049 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.292336 4708 patch_prober.go:28] interesting pod/route-controller-manager-7bd4f69545-56vpt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.292726 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" podUID="fcffd92a-6284-4e83-af6a-e49abbdd7387" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.554979 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.562393 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.600588 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d876b5d65-7h5pt"] Mar 20 16:04:49 crc kubenswrapper[4708]: E0320 16:04:49.604033 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e7a051-112d-4679-821d-75c5bbe8b51f" containerName="controller-manager" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.604071 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e7a051-112d-4679-821d-75c5bbe8b51f" containerName="controller-manager" Mar 20 16:04:49 crc kubenswrapper[4708]: E0320 16:04:49.604092 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1acefb60-c1cd-4b60-8a16-c99c44f4d8c7" containerName="collect-profiles" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.604102 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acefb60-c1cd-4b60-8a16-c99c44f4d8c7" containerName="collect-profiles" Mar 20 16:04:49 crc kubenswrapper[4708]: E0320 16:04:49.604116 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcffd92a-6284-4e83-af6a-e49abbdd7387" containerName="route-controller-manager" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.604124 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcffd92a-6284-4e83-af6a-e49abbdd7387" containerName="route-controller-manager" Mar 20 16:04:49 crc kubenswrapper[4708]: E0320 16:04:49.604143 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b34f95-136c-4980-ae84-8a47f06083a9" containerName="pruner" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.604152 4708 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b34f95-136c-4980-ae84-8a47f06083a9" containerName="pruner" Mar 20 16:04:49 crc kubenswrapper[4708]: E0320 16:04:49.604169 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b7e911-a613-4360-87d3-cbcedcd1283e" containerName="pruner" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.604178 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b7e911-a613-4360-87d3-cbcedcd1283e" containerName="pruner" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.604397 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e7a051-112d-4679-821d-75c5bbe8b51f" containerName="controller-manager" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.604416 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b34f95-136c-4980-ae84-8a47f06083a9" containerName="pruner" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.604428 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b7e911-a613-4360-87d3-cbcedcd1283e" containerName="pruner" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.604438 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="1acefb60-c1cd-4b60-8a16-c99c44f4d8c7" containerName="collect-profiles" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.604449 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcffd92a-6284-4e83-af6a-e49abbdd7387" containerName="route-controller-manager" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.605643 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.610818 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d876b5d65-7h5pt"] Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.686850 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-client-ca\") pod \"d1e7a051-112d-4679-821d-75c5bbe8b51f\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.686904 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcffd92a-6284-4e83-af6a-e49abbdd7387-serving-cert\") pod \"fcffd92a-6284-4e83-af6a-e49abbdd7387\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.686932 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-config\") pod \"d1e7a051-112d-4679-821d-75c5bbe8b51f\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.686953 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e7a051-112d-4679-821d-75c5bbe8b51f-serving-cert\") pod \"d1e7a051-112d-4679-821d-75c5bbe8b51f\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.686987 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-client-ca\") pod \"fcffd92a-6284-4e83-af6a-e49abbdd7387\" (UID: 
\"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.687020 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-proxy-ca-bundles\") pod \"d1e7a051-112d-4679-821d-75c5bbe8b51f\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.687064 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx8lr\" (UniqueName: \"kubernetes.io/projected/fcffd92a-6284-4e83-af6a-e49abbdd7387-kube-api-access-cx8lr\") pod \"fcffd92a-6284-4e83-af6a-e49abbdd7387\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.687118 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-config\") pod \"fcffd92a-6284-4e83-af6a-e49abbdd7387\" (UID: \"fcffd92a-6284-4e83-af6a-e49abbdd7387\") " Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.687148 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh5m8\" (UniqueName: \"kubernetes.io/projected/d1e7a051-112d-4679-821d-75c5bbe8b51f-kube-api-access-zh5m8\") pod \"d1e7a051-112d-4679-821d-75c5bbe8b51f\" (UID: \"d1e7a051-112d-4679-821d-75c5bbe8b51f\") " Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.688648 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-config" (OuterVolumeSpecName: "config") pod "d1e7a051-112d-4679-821d-75c5bbe8b51f" (UID: "d1e7a051-112d-4679-821d-75c5bbe8b51f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.689211 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d1e7a051-112d-4679-821d-75c5bbe8b51f" (UID: "d1e7a051-112d-4679-821d-75c5bbe8b51f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.689224 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-client-ca" (OuterVolumeSpecName: "client-ca") pod "d1e7a051-112d-4679-821d-75c5bbe8b51f" (UID: "d1e7a051-112d-4679-821d-75c5bbe8b51f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.689572 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-client-ca" (OuterVolumeSpecName: "client-ca") pod "fcffd92a-6284-4e83-af6a-e49abbdd7387" (UID: "fcffd92a-6284-4e83-af6a-e49abbdd7387"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.689980 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-config" (OuterVolumeSpecName: "config") pod "fcffd92a-6284-4e83-af6a-e49abbdd7387" (UID: "fcffd92a-6284-4e83-af6a-e49abbdd7387"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.693743 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcffd92a-6284-4e83-af6a-e49abbdd7387-kube-api-access-cx8lr" (OuterVolumeSpecName: "kube-api-access-cx8lr") pod "fcffd92a-6284-4e83-af6a-e49abbdd7387" (UID: "fcffd92a-6284-4e83-af6a-e49abbdd7387"). InnerVolumeSpecName "kube-api-access-cx8lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.693744 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcffd92a-6284-4e83-af6a-e49abbdd7387-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fcffd92a-6284-4e83-af6a-e49abbdd7387" (UID: "fcffd92a-6284-4e83-af6a-e49abbdd7387"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.694465 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e7a051-112d-4679-821d-75c5bbe8b51f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d1e7a051-112d-4679-821d-75c5bbe8b51f" (UID: "d1e7a051-112d-4679-821d-75c5bbe8b51f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.697003 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e7a051-112d-4679-821d-75c5bbe8b51f-kube-api-access-zh5m8" (OuterVolumeSpecName: "kube-api-access-zh5m8") pod "d1e7a051-112d-4679-821d-75c5bbe8b51f" (UID: "d1e7a051-112d-4679-821d-75c5bbe8b51f"). InnerVolumeSpecName "kube-api-access-zh5m8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.788503 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-config\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.788576 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a247534-75a9-40c2-8777-22d5f7204364-serving-cert\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.788614 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr67s\" (UniqueName: \"kubernetes.io/projected/0a247534-75a9-40c2-8777-22d5f7204364-kube-api-access-cr67s\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.788848 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-proxy-ca-bundles\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.788915 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-client-ca\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.789148 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.789173 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh5m8\" (UniqueName: \"kubernetes.io/projected/d1e7a051-112d-4679-821d-75c5bbe8b51f-kube-api-access-zh5m8\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.789188 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.789202 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcffd92a-6284-4e83-af6a-e49abbdd7387-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.789215 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.789230 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e7a051-112d-4679-821d-75c5bbe8b51f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.789243 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fcffd92a-6284-4e83-af6a-e49abbdd7387-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.789255 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1e7a051-112d-4679-821d-75c5bbe8b51f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.789267 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx8lr\" (UniqueName: \"kubernetes.io/projected/fcffd92a-6284-4e83-af6a-e49abbdd7387-kube-api-access-cx8lr\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.890736 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a247534-75a9-40c2-8777-22d5f7204364-serving-cert\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.890810 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr67s\" (UniqueName: \"kubernetes.io/projected/0a247534-75a9-40c2-8777-22d5f7204364-kube-api-access-cr67s\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.890867 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-proxy-ca-bundles\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 
16:04:49.890898 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-client-ca\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.891015 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-config\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.891969 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-client-ca\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.893034 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-config\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.895393 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-proxy-ca-bundles\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 
16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.901692 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a247534-75a9-40c2-8777-22d5f7204364-serving-cert\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.909255 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr67s\" (UniqueName: \"kubernetes.io/projected/0a247534-75a9-40c2-8777-22d5f7204364-kube-api-access-cr67s\") pod \"controller-manager-7d876b5d65-7h5pt\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:49 crc kubenswrapper[4708]: I0320 16:04:49.933050 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:04:50 crc kubenswrapper[4708]: I0320 16:04:50.110409 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" event={"ID":"fcffd92a-6284-4e83-af6a-e49abbdd7387","Type":"ContainerDied","Data":"9a56bbe2bd047aa5da74bc46e371cb2e8bc7b78ccd242a555094ef43033354e4"} Mar 20 16:04:50 crc kubenswrapper[4708]: I0320 16:04:50.110469 4708 scope.go:117] "RemoveContainer" containerID="10fe4a6a817002218175e35983cbfa45d08229ac9b4ac86fa53d04ea5536859c" Mar 20 16:04:50 crc kubenswrapper[4708]: I0320 16:04:50.110581 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt" Mar 20 16:04:50 crc kubenswrapper[4708]: I0320 16:04:50.113502 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" Mar 20 16:04:50 crc kubenswrapper[4708]: I0320 16:04:50.121024 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb" event={"ID":"d1e7a051-112d-4679-821d-75c5bbe8b51f","Type":"ContainerDied","Data":"b2790354ffa080fb818d2c7c21a241535103e5673e3659dd4ab1e0d01c6b14d7"} Mar 20 16:04:50 crc kubenswrapper[4708]: I0320 16:04:50.148197 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt"] Mar 20 16:04:50 crc kubenswrapper[4708]: I0320 16:04:50.154292 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd4f69545-56vpt"] Mar 20 16:04:50 crc kubenswrapper[4708]: I0320 16:04:50.155634 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb"] Mar 20 16:04:50 crc kubenswrapper[4708]: I0320 16:04:50.158094 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d7fb5f6bc-bqtpb"] Mar 20 16:04:51 crc kubenswrapper[4708]: I0320 16:04:51.942914 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75"] Mar 20 16:04:51 crc kubenswrapper[4708]: I0320 16:04:51.943713 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:51 crc kubenswrapper[4708]: I0320 16:04:51.945283 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 16:04:51 crc kubenswrapper[4708]: I0320 16:04:51.945877 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4708]: I0320 16:04:51.946558 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 16:04:51 crc kubenswrapper[4708]: I0320 16:04:51.947076 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 16:04:51 crc kubenswrapper[4708]: I0320 16:04:51.947241 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 16:04:51 crc kubenswrapper[4708]: I0320 16:04:51.947287 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:04:51 crc kubenswrapper[4708]: I0320 16:04:51.949301 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75"] Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.117192 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e7a051-112d-4679-821d-75c5bbe8b51f" path="/var/lib/kubelet/pods/d1e7a051-112d-4679-821d-75c5bbe8b51f/volumes" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.118049 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcffd92a-6284-4e83-af6a-e49abbdd7387" path="/var/lib/kubelet/pods/fcffd92a-6284-4e83-af6a-e49abbdd7387/volumes" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 
16:04:52.127396 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf4t8\" (UniqueName: \"kubernetes.io/projected/db015266-db58-4200-93a3-a5ace1c276ee-kube-api-access-rf4t8\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.127446 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-client-ca\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.127472 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db015266-db58-4200-93a3-a5ace1c276ee-serving-cert\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.127513 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-config\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.228253 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf4t8\" (UniqueName: 
\"kubernetes.io/projected/db015266-db58-4200-93a3-a5ace1c276ee-kube-api-access-rf4t8\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.228310 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-client-ca\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.228341 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db015266-db58-4200-93a3-a5ace1c276ee-serving-cert\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.228395 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-config\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.229416 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-client-ca\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc 
kubenswrapper[4708]: I0320 16:04:52.229720 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-config\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.245512 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf4t8\" (UniqueName: \"kubernetes.io/projected/db015266-db58-4200-93a3-a5ace1c276ee-kube-api-access-rf4t8\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.250454 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db015266-db58-4200-93a3-a5ace1c276ee-serving-cert\") pod \"route-controller-manager-54cbdcf87b-w7p75\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:52 crc kubenswrapper[4708]: I0320 16:04:52.306811 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:04:56 crc kubenswrapper[4708]: I0320 16:04:56.178755 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:04:56 crc kubenswrapper[4708]: I0320 16:04:56.178862 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:04:57 crc kubenswrapper[4708]: I0320 16:04:57.968662 4708 scope.go:117] "RemoveContainer" containerID="53f3d89a423eb8e436cd659ad1e256eae8d088f075b78743161bd887ead38cf1" Mar 20 16:04:58 crc kubenswrapper[4708]: I0320 16:04:58.516648 4708 ???:1] "http: TLS handshake error from 192.168.126.11:38670: no serving certificate available for the kubelet" Mar 20 16:05:00 crc kubenswrapper[4708]: I0320 16:05:00.966112 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-696kc" Mar 20 16:05:01 crc kubenswrapper[4708]: I0320 16:05:01.485312 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d876b5d65-7h5pt"] Mar 20 16:05:01 crc kubenswrapper[4708]: I0320 16:05:01.587705 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75"] Mar 20 16:05:02 crc kubenswrapper[4708]: E0320 16:05:02.048309 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system 
image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 16:05:02 crc kubenswrapper[4708]: E0320 16:05:02.048491 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hhc5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6675d_openshift-marketplace(b7a193fd-f6c4-4779-ba32-d746f05094c1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Mar 20 16:05:02 crc kubenswrapper[4708]: E0320 16:05:02.049893 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6675d" podUID="b7a193fd-f6c4-4779-ba32-d746f05094c1" Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.039562 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.040335 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.043731 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.043872 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.046631 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.091533 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71c31cfe-5dd9-42a4-899e-1cac6b1121f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.091600 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71c31cfe-5dd9-42a4-899e-1cac6b1121f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.193088 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71c31cfe-5dd9-42a4-899e-1cac6b1121f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.193143 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71c31cfe-5dd9-42a4-899e-1cac6b1121f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.193323 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71c31cfe-5dd9-42a4-899e-1cac6b1121f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.221890 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71c31cfe-5dd9-42a4-899e-1cac6b1121f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:03 crc kubenswrapper[4708]: I0320 16:05:03.370298 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:04 crc kubenswrapper[4708]: E0320 16:05:04.957639 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6675d" podUID="b7a193fd-f6c4-4779-ba32-d746f05094c1" Mar 20 16:05:05 crc kubenswrapper[4708]: E0320 16:05:05.062008 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 16:05:05 crc kubenswrapper[4708]: E0320 16:05:05.062192 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n2vz6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vzncs_openshift-marketplace(0d8b3c28-42d0-479b-b45d-0fe00b8cb36d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 16:05:05 crc kubenswrapper[4708]: E0320 16:05:05.064979 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vzncs" podUID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" Mar 20 16:05:05 crc 
kubenswrapper[4708]: E0320 16:05:05.073051 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 16:05:05 crc kubenswrapper[4708]: E0320 16:05:05.073290 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m9mnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-hvrxr_openshift-marketplace(bece7d1b-b5d8-4762-b0b8-b2752c422776): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 16:05:05 crc kubenswrapper[4708]: E0320 16:05:05.074612 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hvrxr" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" Mar 20 16:05:06 crc kubenswrapper[4708]: E0320 16:05:06.788134 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hvrxr" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" Mar 20 16:05:06 crc kubenswrapper[4708]: E0320 16:05:06.788170 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vzncs" podUID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" Mar 20 16:05:06 crc kubenswrapper[4708]: E0320 16:05:06.864222 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 16:05:06 crc kubenswrapper[4708]: E0320 16:05:06.864736 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q884g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5frmt_openshift-marketplace(cdc1f7ff-9725-4049-b06c-50a4adfa1696): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 16:05:06 crc kubenswrapper[4708]: E0320 16:05:06.865952 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5frmt" podUID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" Mar 20 16:05:07 crc kubenswrapper[4708]: I0320 16:05:07.006656 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gtlzm"] Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.427263 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.429079 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.437564 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.494195 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebff241e-ba92-403f-991f-82c78be60f16-kube-api-access\") pod \"installer-9-crc\" (UID: \"ebff241e-ba92-403f-991f-82c78be60f16\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.494319 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ebff241e-ba92-403f-991f-82c78be60f16\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.494383 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-var-lock\") pod \"installer-9-crc\" (UID: 
\"ebff241e-ba92-403f-991f-82c78be60f16\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.595560 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebff241e-ba92-403f-991f-82c78be60f16-kube-api-access\") pod \"installer-9-crc\" (UID: \"ebff241e-ba92-403f-991f-82c78be60f16\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.595694 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ebff241e-ba92-403f-991f-82c78be60f16\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.595728 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-var-lock\") pod \"installer-9-crc\" (UID: \"ebff241e-ba92-403f-991f-82c78be60f16\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.595815 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-var-lock\") pod \"installer-9-crc\" (UID: \"ebff241e-ba92-403f-991f-82c78be60f16\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.595990 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ebff241e-ba92-403f-991f-82c78be60f16\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.622788 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebff241e-ba92-403f-991f-82c78be60f16-kube-api-access\") pod \"installer-9-crc\" (UID: \"ebff241e-ba92-403f-991f-82c78be60f16\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:08 crc kubenswrapper[4708]: I0320 16:05:08.763328 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:10 crc kubenswrapper[4708]: E0320 16:05:10.548244 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5frmt" podUID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" Mar 20 16:05:10 crc kubenswrapper[4708]: E0320 16:05:10.624438 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 16:05:10 crc kubenswrapper[4708]: E0320 16:05:10.624717 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xvqzm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mr9ct_openshift-marketplace(18d7f096-fc87-4f68-959e-5ab803b7e097): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 16:05:10 crc kubenswrapper[4708]: E0320 16:05:10.625946 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mr9ct" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" Mar 20 16:05:10 crc 
kubenswrapper[4708]: I0320 16:05:10.895744 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 16:05:11 crc kubenswrapper[4708]: I0320 16:05:11.007331 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d876b5d65-7h5pt"] Mar 20 16:05:11 crc kubenswrapper[4708]: W0320 16:05:11.021463 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a247534_75a9_40c2_8777_22d5f7204364.slice/crio-28edc1e1ce80b19528e85e233d182cf4cf5cc9aa9c319ba28391a858ac972050 WatchSource:0}: Error finding container 28edc1e1ce80b19528e85e233d182cf4cf5cc9aa9c319ba28391a858ac972050: Status 404 returned error can't find the container with id 28edc1e1ce80b19528e85e233d182cf4cf5cc9aa9c319ba28391a858ac972050 Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.026407 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.026536 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bwtbf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dk7n9_openshift-marketplace(65dd12f7-e1af-4213-b264-d846c01eaba8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.028404 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dk7n9" podUID="65dd12f7-e1af-4213-b264-d846c01eaba8" Mar 20 16:05:11 crc 
kubenswrapper[4708]: I0320 16:05:11.054811 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 16:05:11 crc kubenswrapper[4708]: I0320 16:05:11.060880 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75"] Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.082021 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.082221 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7r5tz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-pmd2p_openshift-marketplace(b626637b-4d54-4743-9717-142fe62e392b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.083450 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pmd2p" podUID="b626637b-4d54-4743-9717-142fe62e392b" Mar 20 16:05:11 crc 
kubenswrapper[4708]: W0320 16:05:11.142466 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb015266_db58_4200_93a3_a5ace1c276ee.slice/crio-cd2bdb8a276558f96afe36b52ac7b98374fa014470c745e90a88feb8c71c7b71 WatchSource:0}: Error finding container cd2bdb8a276558f96afe36b52ac7b98374fa014470c745e90a88feb8c71c7b71: Status 404 returned error can't find the container with id cd2bdb8a276558f96afe36b52ac7b98374fa014470c745e90a88feb8c71c7b71 Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.185863 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.186106 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wh56j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-h4kcq_openshift-marketplace(ca697a79-760d-4ae1-827f-bc2b0aee1785): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.187478 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-h4kcq" podUID="ca697a79-760d-4ae1-827f-bc2b0aee1785" Mar 20 16:05:11 crc 
kubenswrapper[4708]: I0320 16:05:11.226768 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" event={"ID":"3574461f-8c2b-446b-a2f1-c1be3a8d7824","Type":"ContainerStarted","Data":"6a5632d05de65bd25e133bcc3842b59b9895ca2b5a133c85b82abe7bcc648f5e"} Mar 20 16:05:11 crc kubenswrapper[4708]: I0320 16:05:11.226826 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" event={"ID":"3574461f-8c2b-446b-a2f1-c1be3a8d7824","Type":"ContainerStarted","Data":"07310d886be8cd396f3653d8968d6741a2395d4c5e7adf9796aa3cca4690969a"} Mar 20 16:05:11 crc kubenswrapper[4708]: I0320 16:05:11.228752 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" event={"ID":"0a247534-75a9-40c2-8777-22d5f7204364","Type":"ContainerStarted","Data":"28edc1e1ce80b19528e85e233d182cf4cf5cc9aa9c319ba28391a858ac972050"} Mar 20 16:05:11 crc kubenswrapper[4708]: I0320 16:05:11.230362 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" event={"ID":"db015266-db58-4200-93a3-a5ace1c276ee","Type":"ContainerStarted","Data":"cd2bdb8a276558f96afe36b52ac7b98374fa014470c745e90a88feb8c71c7b71"} Mar 20 16:05:11 crc kubenswrapper[4708]: I0320 16:05:11.233450 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71c31cfe-5dd9-42a4-899e-1cac6b1121f3","Type":"ContainerStarted","Data":"9888ab78d3ee7a939b3fd5ea9de77a52d4f5b19fb8f90886fe33af4438547d52"} Mar 20 16:05:11 crc kubenswrapper[4708]: I0320 16:05:11.235695 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ebff241e-ba92-403f-991f-82c78be60f16","Type":"ContainerStarted","Data":"06c7b359940d530d1b2c71c3de3dc0dcd2cb17da503f885a1606c85fa4bf12b2"} Mar 20 16:05:11 crc kubenswrapper[4708]: 
E0320 16:05:11.238009 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-h4kcq" podUID="ca697a79-760d-4ae1-827f-bc2b0aee1785" Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.238128 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dk7n9" podUID="65dd12f7-e1af-4213-b264-d846c01eaba8" Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.238222 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pmd2p" podUID="b626637b-4d54-4743-9717-142fe62e392b" Mar 20 16:05:11 crc kubenswrapper[4708]: E0320 16:05:11.238270 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mr9ct" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.015833 4708 csr.go:261] certificate signing request csr-fmw7v is approved, waiting to be issued Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.025986 4708 csr.go:257] certificate signing request csr-fmw7v is issued Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.244348 4708 generic.go:334] "Generic (PLEG): container finished" podID="71c31cfe-5dd9-42a4-899e-1cac6b1121f3" 
containerID="923fa9d28c8809c73869c5aff28838674f38815581b10cf468678e1eb6405240" exitCode=0 Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.244422 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71c31cfe-5dd9-42a4-899e-1cac6b1121f3","Type":"ContainerDied","Data":"923fa9d28c8809c73869c5aff28838674f38815581b10cf468678e1eb6405240"} Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.249167 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ebff241e-ba92-403f-991f-82c78be60f16","Type":"ContainerStarted","Data":"819ce2cd55ffebd0c67c059a21d116a16fa4b0853e0e2fbc5d1d0eacb6c0b208"} Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.250643 4708 generic.go:334] "Generic (PLEG): container finished" podID="69fd6ec4-db77-4309-977a-7e80359aec50" containerID="641fb982898e899e43560465ef8157ebbd867e59cf3b51e230c9ea13b27cba66" exitCode=0 Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.250734 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-8vwl5" event={"ID":"69fd6ec4-db77-4309-977a-7e80359aec50","Type":"ContainerDied","Data":"641fb982898e899e43560465ef8157ebbd867e59cf3b51e230c9ea13b27cba66"} Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.252905 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gtlzm" event={"ID":"3574461f-8c2b-446b-a2f1-c1be3a8d7824","Type":"ContainerStarted","Data":"bc9c116b04921fcd3796ec59febaed5bd9ad3f0f9c2b0401928e80710624b553"} Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.254298 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" event={"ID":"0a247534-75a9-40c2-8777-22d5f7204364","Type":"ContainerStarted","Data":"b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be"} Mar 20 16:05:12 crc 
kubenswrapper[4708]: I0320 16:05:12.254379 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" podUID="0a247534-75a9-40c2-8777-22d5f7204364" containerName="controller-manager" containerID="cri-o://b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be" gracePeriod=30 Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.254554 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.263179 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" event={"ID":"db015266-db58-4200-93a3-a5ace1c276ee","Type":"ContainerStarted","Data":"47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263"} Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.263399 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" podUID="db015266-db58-4200-93a3-a5ace1c276ee" containerName="route-controller-manager" containerID="cri-o://47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263" gracePeriod=30 Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.264067 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.269184 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.282580 4708 patch_prober.go:28] interesting pod/controller-manager-7d876b5d65-7h5pt container/controller-manager namespace/openshift-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.57:8443/healthz\": EOF" start-of-body= Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.283139 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" podUID="0a247534-75a9-40c2-8777-22d5f7204364" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": EOF" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.288366 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.28833234 podStartE2EDuration="4.28833234s" podCreationTimestamp="2026-03-20 16:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:12.277468213 +0000 UTC m=+266.951804948" watchObservedRunningTime="2026-03-20 16:05:12.28833234 +0000 UTC m=+266.962669055" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.309430 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gtlzm" podStartSLOduration=211.309415275 podStartE2EDuration="3m31.309415275s" podCreationTimestamp="2026-03-20 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:12.30815406 +0000 UTC m=+266.982490775" watchObservedRunningTime="2026-03-20 16:05:12.309415275 +0000 UTC m=+266.983751990" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.311425 4708 patch_prober.go:28] interesting pod/route-controller-manager-54cbdcf87b-w7p75 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Mar 20 16:05:12 crc kubenswrapper[4708]: 
I0320 16:05:12.311499 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" podUID="db015266-db58-4200-93a3-a5ace1c276ee" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.327444 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" podStartSLOduration=31.327422783 podStartE2EDuration="31.327422783s" podCreationTimestamp="2026-03-20 16:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:12.326483608 +0000 UTC m=+267.000820323" watchObservedRunningTime="2026-03-20 16:05:12.327422783 +0000 UTC m=+267.001759498" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.347513 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" podStartSLOduration=31.347488711 podStartE2EDuration="31.347488711s" podCreationTimestamp="2026-03-20 16:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:12.34425949 +0000 UTC m=+267.018596205" watchObservedRunningTime="2026-03-20 16:05:12.347488711 +0000 UTC m=+267.021825426" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.686967 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.692104 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.713506 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2"] Mar 20 16:05:12 crc kubenswrapper[4708]: E0320 16:05:12.713879 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db015266-db58-4200-93a3-a5ace1c276ee" containerName="route-controller-manager" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.713902 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="db015266-db58-4200-93a3-a5ace1c276ee" containerName="route-controller-manager" Mar 20 16:05:12 crc kubenswrapper[4708]: E0320 16:05:12.713945 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a247534-75a9-40c2-8777-22d5f7204364" containerName="controller-manager" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.713954 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a247534-75a9-40c2-8777-22d5f7204364" containerName="controller-manager" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.714109 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a247534-75a9-40c2-8777-22d5f7204364" containerName="controller-manager" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.714122 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="db015266-db58-4200-93a3-a5ace1c276ee" containerName="route-controller-manager" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.714737 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.748922 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2"] Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765410 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-client-ca\") pod \"db015266-db58-4200-93a3-a5ace1c276ee\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765481 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-config\") pod \"db015266-db58-4200-93a3-a5ace1c276ee\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765536 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr67s\" (UniqueName: \"kubernetes.io/projected/0a247534-75a9-40c2-8777-22d5f7204364-kube-api-access-cr67s\") pod \"0a247534-75a9-40c2-8777-22d5f7204364\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765567 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a247534-75a9-40c2-8777-22d5f7204364-serving-cert\") pod \"0a247534-75a9-40c2-8777-22d5f7204364\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765616 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-proxy-ca-bundles\") pod \"0a247534-75a9-40c2-8777-22d5f7204364\" (UID: 
\"0a247534-75a9-40c2-8777-22d5f7204364\") " Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765636 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db015266-db58-4200-93a3-a5ace1c276ee-serving-cert\") pod \"db015266-db58-4200-93a3-a5ace1c276ee\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765663 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-client-ca\") pod \"0a247534-75a9-40c2-8777-22d5f7204364\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765702 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-config\") pod \"0a247534-75a9-40c2-8777-22d5f7204364\" (UID: \"0a247534-75a9-40c2-8777-22d5f7204364\") " Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765718 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf4t8\" (UniqueName: \"kubernetes.io/projected/db015266-db58-4200-93a3-a5ace1c276ee-kube-api-access-rf4t8\") pod \"db015266-db58-4200-93a3-a5ace1c276ee\" (UID: \"db015266-db58-4200-93a3-a5ace1c276ee\") " Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765901 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq54s\" (UniqueName: \"kubernetes.io/projected/902d92e4-0099-45a9-8d18-13dfaa5613f6-kube-api-access-mq54s\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765929 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-proxy-ca-bundles\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765963 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902d92e4-0099-45a9-8d18-13dfaa5613f6-serving-cert\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.765987 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-client-ca\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.766036 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-config\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.766251 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-client-ca" (OuterVolumeSpecName: "client-ca") pod "db015266-db58-4200-93a3-a5ace1c276ee" (UID: "db015266-db58-4200-93a3-a5ace1c276ee"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.767120 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0a247534-75a9-40c2-8777-22d5f7204364" (UID: "0a247534-75a9-40c2-8777-22d5f7204364"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.767146 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-config" (OuterVolumeSpecName: "config") pod "0a247534-75a9-40c2-8777-22d5f7204364" (UID: "0a247534-75a9-40c2-8777-22d5f7204364"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.767130 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-client-ca" (OuterVolumeSpecName: "client-ca") pod "0a247534-75a9-40c2-8777-22d5f7204364" (UID: "0a247534-75a9-40c2-8777-22d5f7204364"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.767410 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-config" (OuterVolumeSpecName: "config") pod "db015266-db58-4200-93a3-a5ace1c276ee" (UID: "db015266-db58-4200-93a3-a5ace1c276ee"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.773051 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a247534-75a9-40c2-8777-22d5f7204364-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0a247534-75a9-40c2-8777-22d5f7204364" (UID: "0a247534-75a9-40c2-8777-22d5f7204364"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.774116 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db015266-db58-4200-93a3-a5ace1c276ee-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "db015266-db58-4200-93a3-a5ace1c276ee" (UID: "db015266-db58-4200-93a3-a5ace1c276ee"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.774899 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db015266-db58-4200-93a3-a5ace1c276ee-kube-api-access-rf4t8" (OuterVolumeSpecName: "kube-api-access-rf4t8") pod "db015266-db58-4200-93a3-a5ace1c276ee" (UID: "db015266-db58-4200-93a3-a5ace1c276ee"). InnerVolumeSpecName "kube-api-access-rf4t8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.776296 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a247534-75a9-40c2-8777-22d5f7204364-kube-api-access-cr67s" (OuterVolumeSpecName: "kube-api-access-cr67s") pod "0a247534-75a9-40c2-8777-22d5f7204364" (UID: "0a247534-75a9-40c2-8777-22d5f7204364"). InnerVolumeSpecName "kube-api-access-cr67s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.867818 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-client-ca\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868026 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-config\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868085 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq54s\" (UniqueName: \"kubernetes.io/projected/902d92e4-0099-45a9-8d18-13dfaa5613f6-kube-api-access-mq54s\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868107 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-proxy-ca-bundles\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868141 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902d92e4-0099-45a9-8d18-13dfaa5613f6-serving-cert\") pod 
\"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868190 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr67s\" (UniqueName: \"kubernetes.io/projected/0a247534-75a9-40c2-8777-22d5f7204364-kube-api-access-cr67s\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868201 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a247534-75a9-40c2-8777-22d5f7204364-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868212 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868221 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db015266-db58-4200-93a3-a5ace1c276ee-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868230 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868241 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a247534-75a9-40c2-8777-22d5f7204364-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868253 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf4t8\" (UniqueName: \"kubernetes.io/projected/db015266-db58-4200-93a3-a5ace1c276ee-kube-api-access-rf4t8\") on 
node \"crc\" DevicePath \"\"" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868263 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.868272 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db015266-db58-4200-93a3-a5ace1c276ee-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.869330 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-proxy-ca-bundles\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.871005 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-client-ca\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.871411 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-config\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.871568 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/902d92e4-0099-45a9-8d18-13dfaa5613f6-serving-cert\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:12 crc kubenswrapper[4708]: I0320 16:05:12.885064 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq54s\" (UniqueName: \"kubernetes.io/projected/902d92e4-0099-45a9-8d18-13dfaa5613f6-kube-api-access-mq54s\") pod \"controller-manager-7987d8fbc7-l8rw2\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") " pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.027857 4708 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 09:36:24.834487806 +0000 UTC Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.028286 4708 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6185h31m11.806204957s for next certificate rotation Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.062245 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.250353 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2"] Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.277636 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" event={"ID":"902d92e4-0099-45a9-8d18-13dfaa5613f6","Type":"ContainerStarted","Data":"f0885321abcec2a9077f1894f14353d930d8890fc10826a87e0f0565e893e643"} Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.286761 4708 generic.go:334] "Generic (PLEG): container finished" podID="0a247534-75a9-40c2-8777-22d5f7204364" containerID="b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be" exitCode=0 Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.286859 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" event={"ID":"0a247534-75a9-40c2-8777-22d5f7204364","Type":"ContainerDied","Data":"b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be"} Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.286897 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" event={"ID":"0a247534-75a9-40c2-8777-22d5f7204364","Type":"ContainerDied","Data":"28edc1e1ce80b19528e85e233d182cf4cf5cc9aa9c319ba28391a858ac972050"} Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.286929 4708 scope.go:117] "RemoveContainer" containerID="b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.287112 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7d876b5d65-7h5pt" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.291094 4708 generic.go:334] "Generic (PLEG): container finished" podID="db015266-db58-4200-93a3-a5ace1c276ee" containerID="47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263" exitCode=0 Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.291405 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" event={"ID":"db015266-db58-4200-93a3-a5ace1c276ee","Type":"ContainerDied","Data":"47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263"} Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.291445 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.291464 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75" event={"ID":"db015266-db58-4200-93a3-a5ace1c276ee","Type":"ContainerDied","Data":"cd2bdb8a276558f96afe36b52ac7b98374fa014470c745e90a88feb8c71c7b71"} Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.308643 4708 scope.go:117] "RemoveContainer" containerID="b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be" Mar 20 16:05:13 crc kubenswrapper[4708]: E0320 16:05:13.310497 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be\": container with ID starting with b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be not found: ID does not exist" containerID="b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.320806 4708 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be"} err="failed to get container status \"b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be\": rpc error: code = NotFound desc = could not find container \"b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be\": container with ID starting with b62936c5fe6cc5155814d8aeb33cf549eb4997db4dd87c10ba5ffc5872fa81be not found: ID does not exist" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.320857 4708 scope.go:117] "RemoveContainer" containerID="47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.318468 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d876b5d65-7h5pt"] Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.328176 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7d876b5d65-7h5pt"] Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.342871 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75"] Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.345594 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54cbdcf87b-w7p75"] Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.356703 4708 scope.go:117] "RemoveContainer" containerID="47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263" Mar 20 16:05:13 crc kubenswrapper[4708]: E0320 16:05:13.357220 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263\": container with ID starting with 
47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263 not found: ID does not exist" containerID="47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.357278 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263"} err="failed to get container status \"47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263\": rpc error: code = NotFound desc = could not find container \"47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263\": container with ID starting with 47a379b97fe2db4e03e9739538397f434a61557ca2c924fe883c7662994f9263 not found: ID does not exist" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.481614 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.562491 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-8vwl5" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.577978 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kube-api-access\") pod \"71c31cfe-5dd9-42a4-899e-1cac6b1121f3\" (UID: \"71c31cfe-5dd9-42a4-899e-1cac6b1121f3\") " Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.578255 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kubelet-dir\") pod \"71c31cfe-5dd9-42a4-899e-1cac6b1121f3\" (UID: \"71c31cfe-5dd9-42a4-899e-1cac6b1121f3\") " Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.578826 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "71c31cfe-5dd9-42a4-899e-1cac6b1121f3" (UID: "71c31cfe-5dd9-42a4-899e-1cac6b1121f3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.584284 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "71c31cfe-5dd9-42a4-899e-1cac6b1121f3" (UID: "71c31cfe-5dd9-42a4-899e-1cac6b1121f3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.679474 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hx92\" (UniqueName: \"kubernetes.io/projected/69fd6ec4-db77-4309-977a-7e80359aec50-kube-api-access-9hx92\") pod \"69fd6ec4-db77-4309-977a-7e80359aec50\" (UID: \"69fd6ec4-db77-4309-977a-7e80359aec50\") " Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.680113 4708 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.680144 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71c31cfe-5dd9-42a4-899e-1cac6b1121f3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.686298 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69fd6ec4-db77-4309-977a-7e80359aec50-kube-api-access-9hx92" (OuterVolumeSpecName: "kube-api-access-9hx92") pod "69fd6ec4-db77-4309-977a-7e80359aec50" (UID: "69fd6ec4-db77-4309-977a-7e80359aec50"). InnerVolumeSpecName "kube-api-access-9hx92". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:13 crc kubenswrapper[4708]: I0320 16:05:13.781741 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hx92\" (UniqueName: \"kubernetes.io/projected/69fd6ec4-db77-4309-977a-7e80359aec50-kube-api-access-9hx92\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.028958 4708 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-14 02:30:43.233405365 +0000 UTC Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.029005 4708 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6442h25m29.204403128s for next certificate rotation Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.119544 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a247534-75a9-40c2-8777-22d5f7204364" path="/var/lib/kubelet/pods/0a247534-75a9-40c2-8777-22d5f7204364/volumes" Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.120390 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db015266-db58-4200-93a3-a5ace1c276ee" path="/var/lib/kubelet/pods/db015266-db58-4200-93a3-a5ace1c276ee/volumes" Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.299115 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-8vwl5" event={"ID":"69fd6ec4-db77-4309-977a-7e80359aec50","Type":"ContainerDied","Data":"a11741a8b0168b98a71d31aacc099c827f38a9461d13989422569dc7538c97fd"} Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.299169 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a11741a8b0168b98a71d31aacc099c827f38a9461d13989422569dc7538c97fd" Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.299130 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-8vwl5"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.305524 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" event={"ID":"902d92e4-0099-45a9-8d18-13dfaa5613f6","Type":"ContainerStarted","Data":"97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990"}
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.305709 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.312350 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71c31cfe-5dd9-42a4-899e-1cac6b1121f3","Type":"ContainerDied","Data":"9888ab78d3ee7a939b3fd5ea9de77a52d4f5b19fb8f90886fe33af4438547d52"}
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.312394 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9888ab78d3ee7a939b3fd5ea9de77a52d4f5b19fb8f90886fe33af4438547d52"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.312431 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.324787 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.360308 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" podStartSLOduration=13.360280597 podStartE2EDuration="13.360280597s" podCreationTimestamp="2026-03-20 16:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:14.341286111 +0000 UTC m=+269.015622876" watchObservedRunningTime="2026-03-20 16:05:14.360280597 +0000 UTC m=+269.034617302"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.960260 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"]
Mar 20 16:05:14 crc kubenswrapper[4708]: E0320 16:05:14.960986 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c31cfe-5dd9-42a4-899e-1cac6b1121f3" containerName="pruner"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.961007 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c31cfe-5dd9-42a4-899e-1cac6b1121f3" containerName="pruner"
Mar 20 16:05:14 crc kubenswrapper[4708]: E0320 16:05:14.961025 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69fd6ec4-db77-4309-977a-7e80359aec50" containerName="oc"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.961040 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="69fd6ec4-db77-4309-977a-7e80359aec50" containerName="oc"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.961215 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="69fd6ec4-db77-4309-977a-7e80359aec50" containerName="oc"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.961237 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c31cfe-5dd9-42a4-899e-1cac6b1121f3" containerName="pruner"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.961833 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.966374 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.966426 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.966540 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.966573 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.966608 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.975053 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 16:05:14 crc kubenswrapper[4708]: I0320 16:05:14.991647 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"]
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.001243 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbdc6\" (UniqueName: \"kubernetes.io/projected/ddd6f616-cd77-4533-8f61-08c26757fc8e-kube-api-access-vbdc6\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.001323 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd6f616-cd77-4533-8f61-08c26757fc8e-serving-cert\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.001518 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-client-ca\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.001558 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-config\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.102662 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-client-ca\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.102726 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-config\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.102809 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbdc6\" (UniqueName: \"kubernetes.io/projected/ddd6f616-cd77-4533-8f61-08c26757fc8e-kube-api-access-vbdc6\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.102837 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd6f616-cd77-4533-8f61-08c26757fc8e-serving-cert\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.104458 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-config\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.104503 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-client-ca\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.127902 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd6f616-cd77-4533-8f61-08c26757fc8e-serving-cert\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.132102 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbdc6\" (UniqueName: \"kubernetes.io/projected/ddd6f616-cd77-4533-8f61-08c26757fc8e-kube-api-access-vbdc6\") pod \"route-controller-manager-78cff4646c-npl8q\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") " pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.288264 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:15 crc kubenswrapper[4708]: I0320 16:05:15.525626 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"]
Mar 20 16:05:16 crc kubenswrapper[4708]: I0320 16:05:16.323366 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q" event={"ID":"ddd6f616-cd77-4533-8f61-08c26757fc8e","Type":"ContainerStarted","Data":"4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279"}
Mar 20 16:05:16 crc kubenswrapper[4708]: I0320 16:05:16.323754 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q" event={"ID":"ddd6f616-cd77-4533-8f61-08c26757fc8e","Type":"ContainerStarted","Data":"004fdb5fe6c2ee17f3512098c5c968395d738b9de8bcb776da6057f4f5b52f7c"}
Mar 20 16:05:16 crc kubenswrapper[4708]: I0320 16:05:16.345911 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q" podStartSLOduration=15.345892227 podStartE2EDuration="15.345892227s" podCreationTimestamp="2026-03-20 16:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:16.342750838 +0000 UTC m=+271.017087583" watchObservedRunningTime="2026-03-20 16:05:16.345892227 +0000 UTC m=+271.020228942"
Mar 20 16:05:17 crc kubenswrapper[4708]: I0320 16:05:17.330289 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:17 crc kubenswrapper[4708]: I0320 16:05:17.336634 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:19 crc kubenswrapper[4708]: I0320 16:05:19.344089 4708 generic.go:334] "Generic (PLEG): container finished" podID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" containerID="da500285cf7e4fe6bc601347a95f51f1d90603f638db9f179c7d9f1be108db93" exitCode=0
Mar 20 16:05:19 crc kubenswrapper[4708]: I0320 16:05:19.344160 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzncs" event={"ID":"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d","Type":"ContainerDied","Data":"da500285cf7e4fe6bc601347a95f51f1d90603f638db9f179c7d9f1be108db93"}
Mar 20 16:05:20 crc kubenswrapper[4708]: I0320 16:05:20.353959 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6675d" event={"ID":"b7a193fd-f6c4-4779-ba32-d746f05094c1","Type":"ContainerStarted","Data":"847e2c5db3c23157f869c572527b0675320960e84349141fae6dab5aa058fc46"}
Mar 20 16:05:20 crc kubenswrapper[4708]: I0320 16:05:20.359134 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzncs" event={"ID":"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d","Type":"ContainerStarted","Data":"559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac"}
Mar 20 16:05:21 crc kubenswrapper[4708]: I0320 16:05:21.364720 4708 generic.go:334] "Generic (PLEG): container finished" podID="b7a193fd-f6c4-4779-ba32-d746f05094c1" containerID="847e2c5db3c23157f869c572527b0675320960e84349141fae6dab5aa058fc46" exitCode=0
Mar 20 16:05:21 crc kubenswrapper[4708]: I0320 16:05:21.364797 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6675d" event={"ID":"b7a193fd-f6c4-4779-ba32-d746f05094c1","Type":"ContainerDied","Data":"847e2c5db3c23157f869c572527b0675320960e84349141fae6dab5aa058fc46"}
Mar 20 16:05:21 crc kubenswrapper[4708]: I0320 16:05:21.383690 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vzncs" podStartSLOduration=4.207303734 podStartE2EDuration="56.383655853s" podCreationTimestamp="2026-03-20 16:04:25 +0000 UTC" firstStartedPulling="2026-03-20 16:04:27.642936998 +0000 UTC m=+222.317273713" lastFinishedPulling="2026-03-20 16:05:19.819289117 +0000 UTC m=+274.493625832" observedRunningTime="2026-03-20 16:05:20.400199822 +0000 UTC m=+275.074536537" watchObservedRunningTime="2026-03-20 16:05:21.383655853 +0000 UTC m=+276.057992568"
Mar 20 16:05:21 crc kubenswrapper[4708]: I0320 16:05:21.489167 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2"]
Mar 20 16:05:21 crc kubenswrapper[4708]: I0320 16:05:21.489611 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" podUID="902d92e4-0099-45a9-8d18-13dfaa5613f6" containerName="controller-manager" containerID="cri-o://97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990" gracePeriod=30
Mar 20 16:05:21 crc kubenswrapper[4708]: I0320 16:05:21.505970 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"]
Mar 20 16:05:21 crc kubenswrapper[4708]: I0320 16:05:21.506197 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q" podUID="ddd6f616-cd77-4533-8f61-08c26757fc8e" containerName="route-controller-manager" containerID="cri-o://4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279" gracePeriod=30
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.000156 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.087174 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.104647 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbdc6\" (UniqueName: \"kubernetes.io/projected/ddd6f616-cd77-4533-8f61-08c26757fc8e-kube-api-access-vbdc6\") pod \"ddd6f616-cd77-4533-8f61-08c26757fc8e\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") "
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.104739 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-config\") pod \"ddd6f616-cd77-4533-8f61-08c26757fc8e\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") "
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.104788 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-client-ca\") pod \"ddd6f616-cd77-4533-8f61-08c26757fc8e\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") "
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.104874 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd6f616-cd77-4533-8f61-08c26757fc8e-serving-cert\") pod \"ddd6f616-cd77-4533-8f61-08c26757fc8e\" (UID: \"ddd6f616-cd77-4533-8f61-08c26757fc8e\") "
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.105607 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-client-ca" (OuterVolumeSpecName: "client-ca") pod "ddd6f616-cd77-4533-8f61-08c26757fc8e" (UID: "ddd6f616-cd77-4533-8f61-08c26757fc8e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.105687 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-config" (OuterVolumeSpecName: "config") pod "ddd6f616-cd77-4533-8f61-08c26757fc8e" (UID: "ddd6f616-cd77-4533-8f61-08c26757fc8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.110240 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd6f616-cd77-4533-8f61-08c26757fc8e-kube-api-access-vbdc6" (OuterVolumeSpecName: "kube-api-access-vbdc6") pod "ddd6f616-cd77-4533-8f61-08c26757fc8e" (UID: "ddd6f616-cd77-4533-8f61-08c26757fc8e"). InnerVolumeSpecName "kube-api-access-vbdc6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.118003 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd6f616-cd77-4533-8f61-08c26757fc8e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ddd6f616-cd77-4533-8f61-08c26757fc8e" (UID: "ddd6f616-cd77-4533-8f61-08c26757fc8e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.205965 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-config\") pod \"902d92e4-0099-45a9-8d18-13dfaa5613f6\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") "
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.206051 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq54s\" (UniqueName: \"kubernetes.io/projected/902d92e4-0099-45a9-8d18-13dfaa5613f6-kube-api-access-mq54s\") pod \"902d92e4-0099-45a9-8d18-13dfaa5613f6\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") "
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.206130 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-proxy-ca-bundles\") pod \"902d92e4-0099-45a9-8d18-13dfaa5613f6\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") "
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.206153 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-client-ca\") pod \"902d92e4-0099-45a9-8d18-13dfaa5613f6\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") "
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.206191 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902d92e4-0099-45a9-8d18-13dfaa5613f6-serving-cert\") pod \"902d92e4-0099-45a9-8d18-13dfaa5613f6\" (UID: \"902d92e4-0099-45a9-8d18-13dfaa5613f6\") "
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.206450 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd6f616-cd77-4533-8f61-08c26757fc8e-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.206466 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbdc6\" (UniqueName: \"kubernetes.io/projected/ddd6f616-cd77-4533-8f61-08c26757fc8e-kube-api-access-vbdc6\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.206477 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.206485 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddd6f616-cd77-4533-8f61-08c26757fc8e-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.207425 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-config" (OuterVolumeSpecName: "config") pod "902d92e4-0099-45a9-8d18-13dfaa5613f6" (UID: "902d92e4-0099-45a9-8d18-13dfaa5613f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.207805 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "902d92e4-0099-45a9-8d18-13dfaa5613f6" (UID: "902d92e4-0099-45a9-8d18-13dfaa5613f6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.210300 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-client-ca" (OuterVolumeSpecName: "client-ca") pod "902d92e4-0099-45a9-8d18-13dfaa5613f6" (UID: "902d92e4-0099-45a9-8d18-13dfaa5613f6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.210662 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902d92e4-0099-45a9-8d18-13dfaa5613f6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "902d92e4-0099-45a9-8d18-13dfaa5613f6" (UID: "902d92e4-0099-45a9-8d18-13dfaa5613f6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.210714 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902d92e4-0099-45a9-8d18-13dfaa5613f6-kube-api-access-mq54s" (OuterVolumeSpecName: "kube-api-access-mq54s") pod "902d92e4-0099-45a9-8d18-13dfaa5613f6" (UID: "902d92e4-0099-45a9-8d18-13dfaa5613f6"). InnerVolumeSpecName "kube-api-access-mq54s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.307839 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902d92e4-0099-45a9-8d18-13dfaa5613f6-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.307895 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.307907 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq54s\" (UniqueName: \"kubernetes.io/projected/902d92e4-0099-45a9-8d18-13dfaa5613f6-kube-api-access-mq54s\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.307920 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.307930 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902d92e4-0099-45a9-8d18-13dfaa5613f6-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.371094 4708 generic.go:334] "Generic (PLEG): container finished" podID="ddd6f616-cd77-4533-8f61-08c26757fc8e" containerID="4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279" exitCode=0
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.371158 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.371163 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q" event={"ID":"ddd6f616-cd77-4533-8f61-08c26757fc8e","Type":"ContainerDied","Data":"4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279"}
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.371239 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q" event={"ID":"ddd6f616-cd77-4533-8f61-08c26757fc8e","Type":"ContainerDied","Data":"004fdb5fe6c2ee17f3512098c5c968395d738b9de8bcb776da6057f4f5b52f7c"}
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.371262 4708 scope.go:117] "RemoveContainer" containerID="4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.375160 4708 generic.go:334] "Generic (PLEG): container finished" podID="902d92e4-0099-45a9-8d18-13dfaa5613f6" containerID="97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990" exitCode=0
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.375225 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.375241 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" event={"ID":"902d92e4-0099-45a9-8d18-13dfaa5613f6","Type":"ContainerDied","Data":"97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990"}
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.375269 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2" event={"ID":"902d92e4-0099-45a9-8d18-13dfaa5613f6","Type":"ContainerDied","Data":"f0885321abcec2a9077f1894f14353d930d8890fc10826a87e0f0565e893e643"}
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.380633 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6675d" event={"ID":"b7a193fd-f6c4-4779-ba32-d746f05094c1","Type":"ContainerStarted","Data":"2c539e24098ea7ca2df9e9168c7e597fba9917c25ac33cca7b4da8d1e4540a5c"}
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.391830 4708 scope.go:117] "RemoveContainer" containerID="4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279"
Mar 20 16:05:22 crc kubenswrapper[4708]: E0320 16:05:22.392485 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279\": container with ID starting with 4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279 not found: ID does not exist" containerID="4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.392526 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279"} err="failed to get container status \"4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279\": rpc error: code = NotFound desc = could not find container \"4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279\": container with ID starting with 4eebd6312477a861b43274850b64565d50f9a8796dc6bfa5602a39b30cedf279 not found: ID does not exist"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.392552 4708 scope.go:117] "RemoveContainer" containerID="97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.397914 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"]
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.405099 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cff4646c-npl8q"]
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.436722 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6675d" podStartSLOduration=2.5333141660000003 podStartE2EDuration="54.436703059s" podCreationTimestamp="2026-03-20 16:04:28 +0000 UTC" firstStartedPulling="2026-03-20 16:04:29.913883662 +0000 UTC m=+224.588220377" lastFinishedPulling="2026-03-20 16:05:21.817272555 +0000 UTC m=+276.491609270" observedRunningTime="2026-03-20 16:05:22.425145952 +0000 UTC m=+277.099482667" watchObservedRunningTime="2026-03-20 16:05:22.436703059 +0000 UTC m=+277.111039774"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.438115 4708 scope.go:117] "RemoveContainer" containerID="97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990"
Mar 20 16:05:22 crc kubenswrapper[4708]: E0320 16:05:22.441847 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990\": container with ID starting with 97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990 not found: ID does not exist" containerID="97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.441894 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990"} err="failed to get container status \"97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990\": rpc error: code = NotFound desc = could not find container \"97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990\": container with ID starting with 97b7875fe40bca1d9322970862a6ee13a563df57ccd454e1512e3ae440661990 not found: ID does not exist"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.449452 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2"]
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.457472 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7987d8fbc7-l8rw2"]
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.962748 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d"]
Mar 20 16:05:22 crc kubenswrapper[4708]: E0320 16:05:22.963088 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902d92e4-0099-45a9-8d18-13dfaa5613f6" containerName="controller-manager"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.963122 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="902d92e4-0099-45a9-8d18-13dfaa5613f6" containerName="controller-manager"
Mar 20 16:05:22 crc kubenswrapper[4708]: E0320 16:05:22.963136 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd6f616-cd77-4533-8f61-08c26757fc8e" containerName="route-controller-manager"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.963142 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd6f616-cd77-4533-8f61-08c26757fc8e" containerName="route-controller-manager"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.963280 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="902d92e4-0099-45a9-8d18-13dfaa5613f6" containerName="controller-manager"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.963299 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd6f616-cd77-4533-8f61-08c26757fc8e" containerName="route-controller-manager"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.963828 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.965574 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.965717 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.966062 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bc8477c76-2cm2t"]
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.966509 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.966786 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.966824 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.966921 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.969225 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.969376 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.969478 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.969578 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.969700 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.970446 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.972231 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.977285 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.979587 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d"]
Mar 20 16:05:22 crc kubenswrapper[4708]: I0320 16:05:22.982201 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bc8477c76-2cm2t"]
Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.018850 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-client-ca\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d"
Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.018908 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/140fa064-5588-4720-b6d8-2caf85fdf0ee-serving-cert\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t"
Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.018937 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-config\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d"
Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.019107 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9719359f-a579-4083-b292-f0ed2a00dbb4-serving-cert\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d"
Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.019154 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-config\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.019183 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-client-ca\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.019309 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-proxy-ca-bundles\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.019477 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6jv2\" (UniqueName: \"kubernetes.io/projected/9719359f-a579-4083-b292-f0ed2a00dbb4-kube-api-access-j6jv2\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.019510 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x92js\" (UniqueName: \"kubernetes.io/projected/140fa064-5588-4720-b6d8-2caf85fdf0ee-kube-api-access-x92js\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: 
\"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.121495 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-config\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.121784 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-client-ca\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.121908 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-proxy-ca-bundles\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.122057 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6jv2\" (UniqueName: \"kubernetes.io/projected/9719359f-a579-4083-b292-f0ed2a00dbb4-kube-api-access-j6jv2\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.122143 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x92js\" (UniqueName: 
\"kubernetes.io/projected/140fa064-5588-4720-b6d8-2caf85fdf0ee-kube-api-access-x92js\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.122246 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-client-ca\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.122317 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/140fa064-5588-4720-b6d8-2caf85fdf0ee-serving-cert\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.122401 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-config\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.122506 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9719359f-a579-4083-b292-f0ed2a00dbb4-serving-cert\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 
16:05:23.122988 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-config\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.123254 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-client-ca\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.123718 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-config\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.124055 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-client-ca\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.124131 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-proxy-ca-bundles\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " 
pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.126711 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9719359f-a579-4083-b292-f0ed2a00dbb4-serving-cert\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.127163 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/140fa064-5588-4720-b6d8-2caf85fdf0ee-serving-cert\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.142448 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x92js\" (UniqueName: \"kubernetes.io/projected/140fa064-5588-4720-b6d8-2caf85fdf0ee-kube-api-access-x92js\") pod \"controller-manager-bc8477c76-2cm2t\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.145023 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6jv2\" (UniqueName: \"kubernetes.io/projected/9719359f-a579-4083-b292-f0ed2a00dbb4-kube-api-access-j6jv2\") pod \"route-controller-manager-67b95f64d8-zr55d\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.282029 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.289271 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.413397 4708 generic.go:334] "Generic (PLEG): container finished" podID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerID="8d8460ba1a16625147ea48db10e794e885cd0575b6f67ab586295f45bdad869a" exitCode=0 Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.413803 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvrxr" event={"ID":"bece7d1b-b5d8-4762-b0b8-b2752c422776","Type":"ContainerDied","Data":"8d8460ba1a16625147ea48db10e794e885cd0575b6f67ab586295f45bdad869a"} Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.438191 4708 generic.go:334] "Generic (PLEG): container finished" podID="ca697a79-760d-4ae1-827f-bc2b0aee1785" containerID="ae8e68f00501ea8d8f0dbd6bdea9c80b0f6885a954ebd4415f744adb99d9b9cb" exitCode=0 Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.438228 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4kcq" event={"ID":"ca697a79-760d-4ae1-827f-bc2b0aee1785","Type":"ContainerDied","Data":"ae8e68f00501ea8d8f0dbd6bdea9c80b0f6885a954ebd4415f744adb99d9b9cb"} Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.506334 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d"] Mar 20 16:05:23 crc kubenswrapper[4708]: I0320 16:05:23.552301 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bc8477c76-2cm2t"] Mar 20 16:05:23 crc kubenswrapper[4708]: W0320 16:05:23.558153 4708 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod140fa064_5588_4720_b6d8_2caf85fdf0ee.slice/crio-8902d3c68998b275d8a10e8895e4491099eaef8af634d9de1b42cdd504048ebe WatchSource:0}: Error finding container 8902d3c68998b275d8a10e8895e4491099eaef8af634d9de1b42cdd504048ebe: Status 404 returned error can't find the container with id 8902d3c68998b275d8a10e8895e4491099eaef8af634d9de1b42cdd504048ebe Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.117307 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902d92e4-0099-45a9-8d18-13dfaa5613f6" path="/var/lib/kubelet/pods/902d92e4-0099-45a9-8d18-13dfaa5613f6/volumes" Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.118232 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd6f616-cd77-4533-8f61-08c26757fc8e" path="/var/lib/kubelet/pods/ddd6f616-cd77-4533-8f61-08c26757fc8e/volumes" Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.444684 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" event={"ID":"9719359f-a579-4083-b292-f0ed2a00dbb4","Type":"ContainerStarted","Data":"7e31b48abc1a043b5a857f37a8913f2ff6472e3eb1736930ac330a8ed2311155"} Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.444736 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" event={"ID":"9719359f-a579-4083-b292-f0ed2a00dbb4","Type":"ContainerStarted","Data":"133942e6fdfb2dd120fbe40eabdc867d241d149bb7b1c445b2b2e1fd01831544"} Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.445102 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.446374 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" event={"ID":"140fa064-5588-4720-b6d8-2caf85fdf0ee","Type":"ContainerStarted","Data":"8b92e066605900d7bbe2d61ad2622e58079076e0dee57c2f0fba50a1d28e3a8d"} Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.446395 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" event={"ID":"140fa064-5588-4720-b6d8-2caf85fdf0ee","Type":"ContainerStarted","Data":"8902d3c68998b275d8a10e8895e4491099eaef8af634d9de1b42cdd504048ebe"} Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.447080 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.449164 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvrxr" event={"ID":"bece7d1b-b5d8-4762-b0b8-b2752c422776","Type":"ContainerStarted","Data":"f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a"} Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.451470 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.451630 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4kcq" event={"ID":"ca697a79-760d-4ae1-827f-bc2b0aee1785","Type":"ContainerStarted","Data":"e8dbbe84fc93f1ae74677ceebfb74b3d3cc8538080306d1efb7bdb4bae852489"} Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.452401 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.464605 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" podStartSLOduration=3.464546061 podStartE2EDuration="3.464546061s" podCreationTimestamp="2026-03-20 16:05:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:24.462766981 +0000 UTC m=+279.137103696" watchObservedRunningTime="2026-03-20 16:05:24.464546061 +0000 UTC m=+279.138882776" Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.493338 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h4kcq" podStartSLOduration=4.153933092 podStartE2EDuration="59.493317224s" podCreationTimestamp="2026-03-20 16:04:25 +0000 UTC" firstStartedPulling="2026-03-20 16:04:28.683490078 +0000 UTC m=+223.357826793" lastFinishedPulling="2026-03-20 16:05:24.02287421 +0000 UTC m=+278.697210925" observedRunningTime="2026-03-20 16:05:24.488875248 +0000 UTC m=+279.163211973" watchObservedRunningTime="2026-03-20 16:05:24.493317224 +0000 UTC m=+279.167653959" Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.545292 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" podStartSLOduration=3.5452707930000003 podStartE2EDuration="3.545270793s" podCreationTimestamp="2026-03-20 16:05:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:24.537898104 +0000 UTC m=+279.212234819" watchObservedRunningTime="2026-03-20 16:05:24.545270793 +0000 UTC m=+279.219607528" Mar 20 16:05:24 crc kubenswrapper[4708]: I0320 16:05:24.561489 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hvrxr" podStartSLOduration=3.402321895 podStartE2EDuration="57.56147316s" podCreationTimestamp="2026-03-20 16:04:27 
+0000 UTC" firstStartedPulling="2026-03-20 16:04:29.815189494 +0000 UTC m=+224.489526199" lastFinishedPulling="2026-03-20 16:05:23.974340749 +0000 UTC m=+278.648677464" observedRunningTime="2026-03-20 16:05:24.560158463 +0000 UTC m=+279.234495188" watchObservedRunningTime="2026-03-20 16:05:24.56147316 +0000 UTC m=+279.235809875" Mar 20 16:05:25 crc kubenswrapper[4708]: I0320 16:05:25.459388 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmd2p" event={"ID":"b626637b-4d54-4743-9717-142fe62e392b","Type":"ContainerStarted","Data":"8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d"} Mar 20 16:05:25 crc kubenswrapper[4708]: I0320 16:05:25.461529 4708 generic.go:334] "Generic (PLEG): container finished" podID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" containerID="9418d2856bd412b98423cdb844e43ac8eb2e4cf1733792cba0da9a24bc63fba5" exitCode=0 Mar 20 16:05:25 crc kubenswrapper[4708]: I0320 16:05:25.461590 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frmt" event={"ID":"cdc1f7ff-9725-4049-b06c-50a4adfa1696","Type":"ContainerDied","Data":"9418d2856bd412b98423cdb844e43ac8eb2e4cf1733792cba0da9a24bc63fba5"} Mar 20 16:05:25 crc kubenswrapper[4708]: I0320 16:05:25.463799 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr9ct" event={"ID":"18d7f096-fc87-4f68-959e-5ab803b7e097","Type":"ContainerStarted","Data":"5e055af0e94e4ef34fff89b9b072c5576d23cc1c554f71a4cadadb9eaaf946e6"} Mar 20 16:05:25 crc kubenswrapper[4708]: I0320 16:05:25.467045 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk7n9" event={"ID":"65dd12f7-e1af-4213-b264-d846c01eaba8","Type":"ContainerStarted","Data":"e3c1ff683b9b4f531f0a21098f0c453ea353c1e9a70077dd8c40d3f4990f374d"} Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.035752 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.035824 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.178426 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.178495 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.178543 4708 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.179139 4708 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1"} pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.179195 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" 
containerID="cri-o://d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1" gracePeriod=600 Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.259809 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.259875 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.476520 4708 generic.go:334] "Generic (PLEG): container finished" podID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerID="d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1" exitCode=0 Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.476597 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerDied","Data":"d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1"} Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.479738 4708 generic.go:334] "Generic (PLEG): container finished" podID="b626637b-4d54-4743-9717-142fe62e392b" containerID="8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d" exitCode=0 Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.479805 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmd2p" event={"ID":"b626637b-4d54-4743-9717-142fe62e392b","Type":"ContainerDied","Data":"8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d"} Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.486643 4708 generic.go:334] "Generic (PLEG): container finished" podID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerID="5e055af0e94e4ef34fff89b9b072c5576d23cc1c554f71a4cadadb9eaaf946e6" exitCode=0 Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.486760 4708 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr9ct" event={"ID":"18d7f096-fc87-4f68-959e-5ab803b7e097","Type":"ContainerDied","Data":"5e055af0e94e4ef34fff89b9b072c5576d23cc1c554f71a4cadadb9eaaf946e6"} Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.490147 4708 generic.go:334] "Generic (PLEG): container finished" podID="65dd12f7-e1af-4213-b264-d846c01eaba8" containerID="e3c1ff683b9b4f531f0a21098f0c453ea353c1e9a70077dd8c40d3f4990f374d" exitCode=0 Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.490243 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk7n9" event={"ID":"65dd12f7-e1af-4213-b264-d846c01eaba8","Type":"ContainerDied","Data":"e3c1ff683b9b4f531f0a21098f0c453ea353c1e9a70077dd8c40d3f4990f374d"} Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.514832 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.527000 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:05:26 crc kubenswrapper[4708]: I0320 16:05:26.572765 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:05:27 crc kubenswrapper[4708]: I0320 16:05:27.499362 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerStarted","Data":"340f6ff788bab07854dcb09b786205fe1caea2674ea8aaca989428a747316c8f"} Mar 20 16:05:27 crc kubenswrapper[4708]: I0320 16:05:27.992963 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:05:27 crc kubenswrapper[4708]: I0320 16:05:27.993506 4708 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:05:28 crc kubenswrapper[4708]: I0320 16:05:28.043598 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:05:28 crc kubenswrapper[4708]: I0320 16:05:28.368223 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:05:28 crc kubenswrapper[4708]: I0320 16:05:28.368283 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:05:28 crc kubenswrapper[4708]: I0320 16:05:28.417326 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:05:28 crc kubenswrapper[4708]: I0320 16:05:28.551103 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:05:29 crc kubenswrapper[4708]: I0320 16:05:29.511995 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmd2p" event={"ID":"b626637b-4d54-4743-9717-142fe62e392b","Type":"ContainerStarted","Data":"f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd"} Mar 20 16:05:29 crc kubenswrapper[4708]: I0320 16:05:29.514282 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frmt" event={"ID":"cdc1f7ff-9725-4049-b06c-50a4adfa1696","Type":"ContainerStarted","Data":"cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111"} Mar 20 16:05:29 crc kubenswrapper[4708]: I0320 16:05:29.516114 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr9ct" 
event={"ID":"18d7f096-fc87-4f68-959e-5ab803b7e097","Type":"ContainerStarted","Data":"c0893ce7d1b0d3399d4c15fbd8d9b1b0c2d4a5713addd02e7cc06144794961d3"} Mar 20 16:05:29 crc kubenswrapper[4708]: I0320 16:05:29.517928 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk7n9" event={"ID":"65dd12f7-e1af-4213-b264-d846c01eaba8","Type":"ContainerStarted","Data":"990ce66c456a25f1e9c43791a2bc32a15760f51b4c2b988e1086fed7550e3004"} Mar 20 16:05:29 crc kubenswrapper[4708]: I0320 16:05:29.551920 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pmd2p" podStartSLOduration=3.332062726 podStartE2EDuration="1m3.551899548s" podCreationTimestamp="2026-03-20 16:04:26 +0000 UTC" firstStartedPulling="2026-03-20 16:04:28.689571381 +0000 UTC m=+223.363908106" lastFinishedPulling="2026-03-20 16:05:28.909408193 +0000 UTC m=+283.583744928" observedRunningTime="2026-03-20 16:05:29.534762314 +0000 UTC m=+284.209099029" watchObservedRunningTime="2026-03-20 16:05:29.551899548 +0000 UTC m=+284.226236263" Mar 20 16:05:29 crc kubenswrapper[4708]: I0320 16:05:29.573233 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mr9ct" podStartSLOduration=2.366596236 podStartE2EDuration="1m0.573210381s" podCreationTimestamp="2026-03-20 16:04:29 +0000 UTC" firstStartedPulling="2026-03-20 16:04:30.910998891 +0000 UTC m=+225.585335606" lastFinishedPulling="2026-03-20 16:05:29.117613046 +0000 UTC m=+283.791949751" observedRunningTime="2026-03-20 16:05:29.552601798 +0000 UTC m=+284.226938513" watchObservedRunningTime="2026-03-20 16:05:29.573210381 +0000 UTC m=+284.247547096" Mar 20 16:05:29 crc kubenswrapper[4708]: I0320 16:05:29.574526 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5frmt" podStartSLOduration=3.99559459 podStartE2EDuration="1m3.574521088s" 
podCreationTimestamp="2026-03-20 16:04:26 +0000 UTC" firstStartedPulling="2026-03-20 16:04:28.718998706 +0000 UTC m=+223.393335421" lastFinishedPulling="2026-03-20 16:05:28.297925204 +0000 UTC m=+282.972261919" observedRunningTime="2026-03-20 16:05:29.572019247 +0000 UTC m=+284.246355962" watchObservedRunningTime="2026-03-20 16:05:29.574521088 +0000 UTC m=+284.248857803" Mar 20 16:05:29 crc kubenswrapper[4708]: I0320 16:05:29.597493 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dk7n9" podStartSLOduration=2.605505285 podStartE2EDuration="1m0.597477316s" podCreationTimestamp="2026-03-20 16:04:29 +0000 UTC" firstStartedPulling="2026-03-20 16:04:30.902401757 +0000 UTC m=+225.576738472" lastFinishedPulling="2026-03-20 16:05:28.894373788 +0000 UTC m=+283.568710503" observedRunningTime="2026-03-20 16:05:29.597365213 +0000 UTC m=+284.271701928" watchObservedRunningTime="2026-03-20 16:05:29.597477316 +0000 UTC m=+284.271814031" Mar 20 16:05:29 crc kubenswrapper[4708]: I0320 16:05:29.778419 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:05:29 crc kubenswrapper[4708]: I0320 16:05:29.778484 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:05:30 crc kubenswrapper[4708]: I0320 16:05:30.841044 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mr9ct" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerName="registry-server" probeResult="failure" output=< Mar 20 16:05:30 crc kubenswrapper[4708]: timeout: failed to connect service ":50051" within 1s Mar 20 16:05:30 crc kubenswrapper[4708]: > Mar 20 16:05:32 crc kubenswrapper[4708]: I0320 16:05:32.347857 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6675d"] Mar 20 16:05:32 crc 
kubenswrapper[4708]: I0320 16:05:32.348408 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6675d" podUID="b7a193fd-f6c4-4779-ba32-d746f05094c1" containerName="registry-server" containerID="cri-o://2c539e24098ea7ca2df9e9168c7e597fba9917c25ac33cca7b4da8d1e4540a5c" gracePeriod=2 Mar 20 16:05:32 crc kubenswrapper[4708]: I0320 16:05:32.539875 4708 generic.go:334] "Generic (PLEG): container finished" podID="b7a193fd-f6c4-4779-ba32-d746f05094c1" containerID="2c539e24098ea7ca2df9e9168c7e597fba9917c25ac33cca7b4da8d1e4540a5c" exitCode=0 Mar 20 16:05:32 crc kubenswrapper[4708]: I0320 16:05:32.539985 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6675d" event={"ID":"b7a193fd-f6c4-4779-ba32-d746f05094c1","Type":"ContainerDied","Data":"2c539e24098ea7ca2df9e9168c7e597fba9917c25ac33cca7b4da8d1e4540a5c"} Mar 20 16:05:32 crc kubenswrapper[4708]: I0320 16:05:32.827399 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:05:32 crc kubenswrapper[4708]: I0320 16:05:32.957062 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhc5p\" (UniqueName: \"kubernetes.io/projected/b7a193fd-f6c4-4779-ba32-d746f05094c1-kube-api-access-hhc5p\") pod \"b7a193fd-f6c4-4779-ba32-d746f05094c1\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " Mar 20 16:05:32 crc kubenswrapper[4708]: I0320 16:05:32.957454 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-utilities\") pod \"b7a193fd-f6c4-4779-ba32-d746f05094c1\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " Mar 20 16:05:32 crc kubenswrapper[4708]: I0320 16:05:32.957613 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-catalog-content\") pod \"b7a193fd-f6c4-4779-ba32-d746f05094c1\" (UID: \"b7a193fd-f6c4-4779-ba32-d746f05094c1\") " Mar 20 16:05:32 crc kubenswrapper[4708]: I0320 16:05:32.958952 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-utilities" (OuterVolumeSpecName: "utilities") pod "b7a193fd-f6c4-4779-ba32-d746f05094c1" (UID: "b7a193fd-f6c4-4779-ba32-d746f05094c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:05:32 crc kubenswrapper[4708]: I0320 16:05:32.976455 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a193fd-f6c4-4779-ba32-d746f05094c1-kube-api-access-hhc5p" (OuterVolumeSpecName: "kube-api-access-hhc5p") pod "b7a193fd-f6c4-4779-ba32-d746f05094c1" (UID: "b7a193fd-f6c4-4779-ba32-d746f05094c1"). InnerVolumeSpecName "kube-api-access-hhc5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:32 crc kubenswrapper[4708]: I0320 16:05:32.984281 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7a193fd-f6c4-4779-ba32-d746f05094c1" (UID: "b7a193fd-f6c4-4779-ba32-d746f05094c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:05:33 crc kubenswrapper[4708]: I0320 16:05:33.059656 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:33 crc kubenswrapper[4708]: I0320 16:05:33.060052 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a193fd-f6c4-4779-ba32-d746f05094c1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:33 crc kubenswrapper[4708]: I0320 16:05:33.060160 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhc5p\" (UniqueName: \"kubernetes.io/projected/b7a193fd-f6c4-4779-ba32-d746f05094c1-kube-api-access-hhc5p\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:33 crc kubenswrapper[4708]: I0320 16:05:33.547193 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6675d" event={"ID":"b7a193fd-f6c4-4779-ba32-d746f05094c1","Type":"ContainerDied","Data":"ce1a7b356f057c3847de69a10db943fdeedf69ff04ff8f0f11676509a6b08dd8"} Mar 20 16:05:33 crc kubenswrapper[4708]: I0320 16:05:33.547590 4708 scope.go:117] "RemoveContainer" containerID="2c539e24098ea7ca2df9e9168c7e597fba9917c25ac33cca7b4da8d1e4540a5c" Mar 20 16:05:33 crc kubenswrapper[4708]: I0320 16:05:33.547260 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6675d" Mar 20 16:05:33 crc kubenswrapper[4708]: I0320 16:05:33.569432 4708 scope.go:117] "RemoveContainer" containerID="847e2c5db3c23157f869c572527b0675320960e84349141fae6dab5aa058fc46" Mar 20 16:05:33 crc kubenswrapper[4708]: I0320 16:05:33.585590 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6675d"] Mar 20 16:05:33 crc kubenswrapper[4708]: I0320 16:05:33.589047 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6675d"] Mar 20 16:05:33 crc kubenswrapper[4708]: I0320 16:05:33.608517 4708 scope.go:117] "RemoveContainer" containerID="675abf9754bd968001e9c24811fd424b4d02b0f8c0364bbf8169646da84a493d" Mar 20 16:05:34 crc kubenswrapper[4708]: I0320 16:05:34.118656 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a193fd-f6c4-4779-ba32-d746f05094c1" path="/var/lib/kubelet/pods/b7a193fd-f6c4-4779-ba32-d746f05094c1/volumes" Mar 20 16:05:36 crc kubenswrapper[4708]: I0320 16:05:36.330934 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:05:36 crc kubenswrapper[4708]: I0320 16:05:36.494228 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:05:36 crc kubenswrapper[4708]: I0320 16:05:36.494291 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:05:36 crc kubenswrapper[4708]: I0320 16:05:36.534078 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:05:36 crc kubenswrapper[4708]: I0320 16:05:36.637159 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:05:36 
crc kubenswrapper[4708]: I0320 16:05:36.666743 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:05:36 crc kubenswrapper[4708]: I0320 16:05:36.666792 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:05:36 crc kubenswrapper[4708]: I0320 16:05:36.704573 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:05:37 crc kubenswrapper[4708]: I0320 16:05:37.620022 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:05:38 crc kubenswrapper[4708]: I0320 16:05:38.035902 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:05:38 crc kubenswrapper[4708]: I0320 16:05:38.347061 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pmd2p"] Mar 20 16:05:38 crc kubenswrapper[4708]: I0320 16:05:38.590126 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pmd2p" podUID="b626637b-4d54-4743-9717-142fe62e392b" containerName="registry-server" containerID="cri-o://f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd" gracePeriod=2 Mar 20 16:05:38 crc kubenswrapper[4708]: I0320 16:05:38.945356 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5frmt"] Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.036203 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.160056 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-catalog-content\") pod \"b626637b-4d54-4743-9717-142fe62e392b\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.160133 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r5tz\" (UniqueName: \"kubernetes.io/projected/b626637b-4d54-4743-9717-142fe62e392b-kube-api-access-7r5tz\") pod \"b626637b-4d54-4743-9717-142fe62e392b\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.160180 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-utilities\") pod \"b626637b-4d54-4743-9717-142fe62e392b\" (UID: \"b626637b-4d54-4743-9717-142fe62e392b\") " Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.163381 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-utilities" (OuterVolumeSpecName: "utilities") pod "b626637b-4d54-4743-9717-142fe62e392b" (UID: "b626637b-4d54-4743-9717-142fe62e392b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.168565 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b626637b-4d54-4743-9717-142fe62e392b-kube-api-access-7r5tz" (OuterVolumeSpecName: "kube-api-access-7r5tz") pod "b626637b-4d54-4743-9717-142fe62e392b" (UID: "b626637b-4d54-4743-9717-142fe62e392b"). InnerVolumeSpecName "kube-api-access-7r5tz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.216004 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b626637b-4d54-4743-9717-142fe62e392b" (UID: "b626637b-4d54-4743-9717-142fe62e392b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.261741 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r5tz\" (UniqueName: \"kubernetes.io/projected/b626637b-4d54-4743-9717-142fe62e392b-kube-api-access-7r5tz\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.261781 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.261794 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b626637b-4d54-4743-9717-142fe62e392b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.384460 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.384514 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.433806 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.598465 4708 generic.go:334] "Generic (PLEG): container finished" 
podID="b626637b-4d54-4743-9717-142fe62e392b" containerID="f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd" exitCode=0 Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.598517 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmd2p" event={"ID":"b626637b-4d54-4743-9717-142fe62e392b","Type":"ContainerDied","Data":"f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd"} Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.598571 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmd2p" event={"ID":"b626637b-4d54-4743-9717-142fe62e392b","Type":"ContainerDied","Data":"200213f14b30740421ecd60e2c09f628bb75ea8cc90b5ef3dc9de6094b7cf587"} Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.598598 4708 scope.go:117] "RemoveContainer" containerID="f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.598588 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pmd2p" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.598977 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5frmt" podUID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" containerName="registry-server" containerID="cri-o://cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111" gracePeriod=2 Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.624102 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pmd2p"] Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.627395 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pmd2p"] Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.624548 4708 scope.go:117] "RemoveContainer" containerID="8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.649777 4708 scope.go:117] "RemoveContainer" containerID="02511d17c83c93db7f2fc5999ae295f0d820ac98422957ce98283936f48d2790" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.655077 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.742655 4708 scope.go:117] "RemoveContainer" containerID="f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd" Mar 20 16:05:39 crc kubenswrapper[4708]: E0320 16:05:39.744291 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd\": container with ID starting with f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd not found: ID does not exist" containerID="f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd" Mar 20 
16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.744342 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd"} err="failed to get container status \"f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd\": rpc error: code = NotFound desc = could not find container \"f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd\": container with ID starting with f6af933a4872a0390f5c93fc95bfc76488bc425864ecba1e381c8f51b88059fd not found: ID does not exist" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.744375 4708 scope.go:117] "RemoveContainer" containerID="8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d" Mar 20 16:05:39 crc kubenswrapper[4708]: E0320 16:05:39.744704 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d\": container with ID starting with 8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d not found: ID does not exist" containerID="8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.744764 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d"} err="failed to get container status \"8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d\": rpc error: code = NotFound desc = could not find container \"8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d\": container with ID starting with 8a413a15bc0a13e88feec4ccd3142c0287c5bc5f7717d843d72eff10c693483d not found: ID does not exist" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.744784 4708 scope.go:117] "RemoveContainer" 
containerID="02511d17c83c93db7f2fc5999ae295f0d820ac98422957ce98283936f48d2790" Mar 20 16:05:39 crc kubenswrapper[4708]: E0320 16:05:39.745027 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02511d17c83c93db7f2fc5999ae295f0d820ac98422957ce98283936f48d2790\": container with ID starting with 02511d17c83c93db7f2fc5999ae295f0d820ac98422957ce98283936f48d2790 not found: ID does not exist" containerID="02511d17c83c93db7f2fc5999ae295f0d820ac98422957ce98283936f48d2790" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.745048 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02511d17c83c93db7f2fc5999ae295f0d820ac98422957ce98283936f48d2790"} err="failed to get container status \"02511d17c83c93db7f2fc5999ae295f0d820ac98422957ce98283936f48d2790\": rpc error: code = NotFound desc = could not find container \"02511d17c83c93db7f2fc5999ae295f0d820ac98422957ce98283936f48d2790\": container with ID starting with 02511d17c83c93db7f2fc5999ae295f0d820ac98422957ce98283936f48d2790 not found: ID does not exist" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.818434 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:05:39 crc kubenswrapper[4708]: I0320 16:05:39.878124 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.070308 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.122950 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b626637b-4d54-4743-9717-142fe62e392b" path="/var/lib/kubelet/pods/b626637b-4d54-4743-9717-142fe62e392b/volumes" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.174100 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-catalog-content\") pod \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.174218 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-utilities\") pod \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.174284 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q884g\" (UniqueName: \"kubernetes.io/projected/cdc1f7ff-9725-4049-b06c-50a4adfa1696-kube-api-access-q884g\") pod \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\" (UID: \"cdc1f7ff-9725-4049-b06c-50a4adfa1696\") " Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.175385 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-utilities" (OuterVolumeSpecName: "utilities") pod "cdc1f7ff-9725-4049-b06c-50a4adfa1696" (UID: "cdc1f7ff-9725-4049-b06c-50a4adfa1696"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.179947 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc1f7ff-9725-4049-b06c-50a4adfa1696-kube-api-access-q884g" (OuterVolumeSpecName: "kube-api-access-q884g") pod "cdc1f7ff-9725-4049-b06c-50a4adfa1696" (UID: "cdc1f7ff-9725-4049-b06c-50a4adfa1696"). InnerVolumeSpecName "kube-api-access-q884g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.221027 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdc1f7ff-9725-4049-b06c-50a4adfa1696" (UID: "cdc1f7ff-9725-4049-b06c-50a4adfa1696"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.276178 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q884g\" (UniqueName: \"kubernetes.io/projected/cdc1f7ff-9725-4049-b06c-50a4adfa1696-kube-api-access-q884g\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.276215 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.276226 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc1f7ff-9725-4049-b06c-50a4adfa1696-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.617419 4708 generic.go:334] "Generic (PLEG): container finished" podID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" 
containerID="cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111" exitCode=0 Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.617530 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5frmt" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.617509 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frmt" event={"ID":"cdc1f7ff-9725-4049-b06c-50a4adfa1696","Type":"ContainerDied","Data":"cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111"} Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.617601 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frmt" event={"ID":"cdc1f7ff-9725-4049-b06c-50a4adfa1696","Type":"ContainerDied","Data":"3dde5d1371fd794faf47f04e702da9c61ec0f78b103cc37bbeb2edadda21321b"} Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.617632 4708 scope.go:117] "RemoveContainer" containerID="cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.641632 4708 scope.go:117] "RemoveContainer" containerID="9418d2856bd412b98423cdb844e43ac8eb2e4cf1733792cba0da9a24bc63fba5" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.660762 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5frmt"] Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.664430 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5frmt"] Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.668182 4708 scope.go:117] "RemoveContainer" containerID="94e31968168b89b150e227267b7a7d71fc47007422c9b1e242d8e8a44f723923" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.699073 4708 scope.go:117] "RemoveContainer" containerID="cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111" Mar 20 
16:05:40 crc kubenswrapper[4708]: E0320 16:05:40.699953 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111\": container with ID starting with cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111 not found: ID does not exist" containerID="cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.700055 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111"} err="failed to get container status \"cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111\": rpc error: code = NotFound desc = could not find container \"cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111\": container with ID starting with cb5fef1309b9c0d7ed0c097cb7ae68fb3bfbd9cf2d6cc9a6ee5b50e22a20f111 not found: ID does not exist" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.700122 4708 scope.go:117] "RemoveContainer" containerID="9418d2856bd412b98423cdb844e43ac8eb2e4cf1733792cba0da9a24bc63fba5" Mar 20 16:05:40 crc kubenswrapper[4708]: E0320 16:05:40.701074 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9418d2856bd412b98423cdb844e43ac8eb2e4cf1733792cba0da9a24bc63fba5\": container with ID starting with 9418d2856bd412b98423cdb844e43ac8eb2e4cf1733792cba0da9a24bc63fba5 not found: ID does not exist" containerID="9418d2856bd412b98423cdb844e43ac8eb2e4cf1733792cba0da9a24bc63fba5" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.701134 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9418d2856bd412b98423cdb844e43ac8eb2e4cf1733792cba0da9a24bc63fba5"} err="failed to get container status 
\"9418d2856bd412b98423cdb844e43ac8eb2e4cf1733792cba0da9a24bc63fba5\": rpc error: code = NotFound desc = could not find container \"9418d2856bd412b98423cdb844e43ac8eb2e4cf1733792cba0da9a24bc63fba5\": container with ID starting with 9418d2856bd412b98423cdb844e43ac8eb2e4cf1733792cba0da9a24bc63fba5 not found: ID does not exist" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.701176 4708 scope.go:117] "RemoveContainer" containerID="94e31968168b89b150e227267b7a7d71fc47007422c9b1e242d8e8a44f723923" Mar 20 16:05:40 crc kubenswrapper[4708]: E0320 16:05:40.701748 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e31968168b89b150e227267b7a7d71fc47007422c9b1e242d8e8a44f723923\": container with ID starting with 94e31968168b89b150e227267b7a7d71fc47007422c9b1e242d8e8a44f723923 not found: ID does not exist" containerID="94e31968168b89b150e227267b7a7d71fc47007422c9b1e242d8e8a44f723923" Mar 20 16:05:40 crc kubenswrapper[4708]: I0320 16:05:40.701776 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e31968168b89b150e227267b7a7d71fc47007422c9b1e242d8e8a44f723923"} err="failed to get container status \"94e31968168b89b150e227267b7a7d71fc47007422c9b1e242d8e8a44f723923\": rpc error: code = NotFound desc = could not find container \"94e31968168b89b150e227267b7a7d71fc47007422c9b1e242d8e8a44f723923\": container with ID starting with 94e31968168b89b150e227267b7a7d71fc47007422c9b1e242d8e8a44f723923 not found: ID does not exist" Mar 20 16:05:41 crc kubenswrapper[4708]: I0320 16:05:41.518459 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bc8477c76-2cm2t"] Mar 20 16:05:41 crc kubenswrapper[4708]: I0320 16:05:41.519005 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" 
podUID="140fa064-5588-4720-b6d8-2caf85fdf0ee" containerName="controller-manager" containerID="cri-o://8b92e066605900d7bbe2d61ad2622e58079076e0dee57c2f0fba50a1d28e3a8d" gracePeriod=30 Mar 20 16:05:41 crc kubenswrapper[4708]: I0320 16:05:41.617702 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d"] Mar 20 16:05:41 crc kubenswrapper[4708]: I0320 16:05:41.619007 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" podUID="9719359f-a579-4083-b292-f0ed2a00dbb4" containerName="route-controller-manager" containerID="cri-o://7e31b48abc1a043b5a857f37a8913f2ff6472e3eb1736930ac330a8ed2311155" gracePeriod=30 Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.123075 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" path="/var/lib/kubelet/pods/cdc1f7ff-9725-4049-b06c-50a4adfa1696/volumes" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.632389 4708 generic.go:334] "Generic (PLEG): container finished" podID="140fa064-5588-4720-b6d8-2caf85fdf0ee" containerID="8b92e066605900d7bbe2d61ad2622e58079076e0dee57c2f0fba50a1d28e3a8d" exitCode=0 Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.632467 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" event={"ID":"140fa064-5588-4720-b6d8-2caf85fdf0ee","Type":"ContainerDied","Data":"8b92e066605900d7bbe2d61ad2622e58079076e0dee57c2f0fba50a1d28e3a8d"} Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.634942 4708 generic.go:334] "Generic (PLEG): container finished" podID="9719359f-a579-4083-b292-f0ed2a00dbb4" containerID="7e31b48abc1a043b5a857f37a8913f2ff6472e3eb1736930ac330a8ed2311155" exitCode=0 Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.635006 4708 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" event={"ID":"9719359f-a579-4083-b292-f0ed2a00dbb4","Type":"ContainerDied","Data":"7e31b48abc1a043b5a857f37a8913f2ff6472e3eb1736930ac330a8ed2311155"} Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.861702 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894130 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v"] Mar 20 16:05:42 crc kubenswrapper[4708]: E0320 16:05:42.894323 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b626637b-4d54-4743-9717-142fe62e392b" containerName="extract-utilities" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894335 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b626637b-4d54-4743-9717-142fe62e392b" containerName="extract-utilities" Mar 20 16:05:42 crc kubenswrapper[4708]: E0320 16:05:42.894344 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a193fd-f6c4-4779-ba32-d746f05094c1" containerName="extract-utilities" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894350 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a193fd-f6c4-4779-ba32-d746f05094c1" containerName="extract-utilities" Mar 20 16:05:42 crc kubenswrapper[4708]: E0320 16:05:42.894360 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a193fd-f6c4-4779-ba32-d746f05094c1" containerName="extract-content" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894366 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a193fd-f6c4-4779-ba32-d746f05094c1" containerName="extract-content" Mar 20 16:05:42 crc kubenswrapper[4708]: E0320 16:05:42.894376 4708 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" containerName="extract-content" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894382 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" containerName="extract-content" Mar 20 16:05:42 crc kubenswrapper[4708]: E0320 16:05:42.894390 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b626637b-4d54-4743-9717-142fe62e392b" containerName="registry-server" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894395 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b626637b-4d54-4743-9717-142fe62e392b" containerName="registry-server" Mar 20 16:05:42 crc kubenswrapper[4708]: E0320 16:05:42.894403 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a193fd-f6c4-4779-ba32-d746f05094c1" containerName="registry-server" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894409 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a193fd-f6c4-4779-ba32-d746f05094c1" containerName="registry-server" Mar 20 16:05:42 crc kubenswrapper[4708]: E0320 16:05:42.894416 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" containerName="extract-utilities" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894422 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" containerName="extract-utilities" Mar 20 16:05:42 crc kubenswrapper[4708]: E0320 16:05:42.894431 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" containerName="registry-server" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894438 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" containerName="registry-server" Mar 20 16:05:42 crc kubenswrapper[4708]: E0320 16:05:42.894447 4708 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9719359f-a579-4083-b292-f0ed2a00dbb4" containerName="route-controller-manager" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894453 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="9719359f-a579-4083-b292-f0ed2a00dbb4" containerName="route-controller-manager" Mar 20 16:05:42 crc kubenswrapper[4708]: E0320 16:05:42.894463 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b626637b-4d54-4743-9717-142fe62e392b" containerName="extract-content" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894468 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b626637b-4d54-4743-9717-142fe62e392b" containerName="extract-content" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894550 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a193fd-f6c4-4779-ba32-d746f05094c1" containerName="registry-server" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894564 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc1f7ff-9725-4049-b06c-50a4adfa1696" containerName="registry-server" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894572 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="b626637b-4d54-4743-9717-142fe62e392b" containerName="registry-server" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894580 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="9719359f-a579-4083-b292-f0ed2a00dbb4" containerName="route-controller-manager" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.894928 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.912925 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v"] Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.913281 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-client-ca\") pod \"9719359f-a579-4083-b292-f0ed2a00dbb4\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.913371 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9719359f-a579-4083-b292-f0ed2a00dbb4-serving-cert\") pod \"9719359f-a579-4083-b292-f0ed2a00dbb4\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.913396 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-config\") pod \"9719359f-a579-4083-b292-f0ed2a00dbb4\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.913443 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6jv2\" (UniqueName: \"kubernetes.io/projected/9719359f-a579-4083-b292-f0ed2a00dbb4-kube-api-access-j6jv2\") pod \"9719359f-a579-4083-b292-f0ed2a00dbb4\" (UID: \"9719359f-a579-4083-b292-f0ed2a00dbb4\") " Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.915390 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-client-ca" (OuterVolumeSpecName: "client-ca") pod "9719359f-a579-4083-b292-f0ed2a00dbb4" 
(UID: "9719359f-a579-4083-b292-f0ed2a00dbb4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.915475 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-config" (OuterVolumeSpecName: "config") pod "9719359f-a579-4083-b292-f0ed2a00dbb4" (UID: "9719359f-a579-4083-b292-f0ed2a00dbb4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.919572 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9719359f-a579-4083-b292-f0ed2a00dbb4-kube-api-access-j6jv2" (OuterVolumeSpecName: "kube-api-access-j6jv2") pod "9719359f-a579-4083-b292-f0ed2a00dbb4" (UID: "9719359f-a579-4083-b292-f0ed2a00dbb4"). InnerVolumeSpecName "kube-api-access-j6jv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:42 crc kubenswrapper[4708]: I0320 16:05:42.920831 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9719359f-a579-4083-b292-f0ed2a00dbb4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9719359f-a579-4083-b292-f0ed2a00dbb4" (UID: "9719359f-a579-4083-b292-f0ed2a00dbb4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.016411 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg8gg\" (UniqueName: \"kubernetes.io/projected/eec8cee6-e26a-4b5c-b171-121a3f8b0547-kube-api-access-fg8gg\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.016823 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-config\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.016844 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-client-ca\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.016894 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eec8cee6-e26a-4b5c-b171-121a3f8b0547-serving-cert\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.016944 4708 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9719359f-a579-4083-b292-f0ed2a00dbb4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.016955 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.016965 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6jv2\" (UniqueName: \"kubernetes.io/projected/9719359f-a579-4083-b292-f0ed2a00dbb4-kube-api-access-j6jv2\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.016973 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9719359f-a579-4083-b292-f0ed2a00dbb4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.118167 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eec8cee6-e26a-4b5c-b171-121a3f8b0547-serving-cert\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.118384 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg8gg\" (UniqueName: \"kubernetes.io/projected/eec8cee6-e26a-4b5c-b171-121a3f8b0547-kube-api-access-fg8gg\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.119025 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-config\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.119081 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-client-ca\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.120793 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-client-ca\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.121276 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-config\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.138806 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eec8cee6-e26a-4b5c-b171-121a3f8b0547-serving-cert\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 
16:05:43.148094 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg8gg\" (UniqueName: \"kubernetes.io/projected/eec8cee6-e26a-4b5c-b171-121a3f8b0547-kube-api-access-fg8gg\") pod \"route-controller-manager-7f76f8945d-f6l7v\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.210439 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.222112 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.329362 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-proxy-ca-bundles\") pod \"140fa064-5588-4720-b6d8-2caf85fdf0ee\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.329442 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x92js\" (UniqueName: \"kubernetes.io/projected/140fa064-5588-4720-b6d8-2caf85fdf0ee-kube-api-access-x92js\") pod \"140fa064-5588-4720-b6d8-2caf85fdf0ee\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.329512 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/140fa064-5588-4720-b6d8-2caf85fdf0ee-serving-cert\") pod \"140fa064-5588-4720-b6d8-2caf85fdf0ee\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.329605 4708 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-config\") pod \"140fa064-5588-4720-b6d8-2caf85fdf0ee\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.329633 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-client-ca\") pod \"140fa064-5588-4720-b6d8-2caf85fdf0ee\" (UID: \"140fa064-5588-4720-b6d8-2caf85fdf0ee\") " Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.330643 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-client-ca" (OuterVolumeSpecName: "client-ca") pod "140fa064-5588-4720-b6d8-2caf85fdf0ee" (UID: "140fa064-5588-4720-b6d8-2caf85fdf0ee"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.331383 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-config" (OuterVolumeSpecName: "config") pod "140fa064-5588-4720-b6d8-2caf85fdf0ee" (UID: "140fa064-5588-4720-b6d8-2caf85fdf0ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.331413 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "140fa064-5588-4720-b6d8-2caf85fdf0ee" (UID: "140fa064-5588-4720-b6d8-2caf85fdf0ee"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.334065 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140fa064-5588-4720-b6d8-2caf85fdf0ee-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "140fa064-5588-4720-b6d8-2caf85fdf0ee" (UID: "140fa064-5588-4720-b6d8-2caf85fdf0ee"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.334332 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/140fa064-5588-4720-b6d8-2caf85fdf0ee-kube-api-access-x92js" (OuterVolumeSpecName: "kube-api-access-x92js") pod "140fa064-5588-4720-b6d8-2caf85fdf0ee" (UID: "140fa064-5588-4720-b6d8-2caf85fdf0ee"). InnerVolumeSpecName "kube-api-access-x92js". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.347873 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mr9ct"] Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.348093 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mr9ct" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerName="registry-server" containerID="cri-o://c0893ce7d1b0d3399d4c15fbd8d9b1b0c2d4a5713addd02e7cc06144794961d3" gracePeriod=2 Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.431268 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x92js\" (UniqueName: \"kubernetes.io/projected/140fa064-5588-4720-b6d8-2caf85fdf0ee-kube-api-access-x92js\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.431338 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/140fa064-5588-4720-b6d8-2caf85fdf0ee-serving-cert\") on 
node \"crc\" DevicePath \"\"" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.431372 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.431397 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.431421 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/140fa064-5588-4720-b6d8-2caf85fdf0ee-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.446479 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v"] Mar 20 16:05:43 crc kubenswrapper[4708]: W0320 16:05:43.465982 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeec8cee6_e26a_4b5c_b171_121a3f8b0547.slice/crio-65d20c9b612a438d441caf4dd99c6341bcce001c78bb7a4bdeef3d38eea8276a WatchSource:0}: Error finding container 65d20c9b612a438d441caf4dd99c6341bcce001c78bb7a4bdeef3d38eea8276a: Status 404 returned error can't find the container with id 65d20c9b612a438d441caf4dd99c6341bcce001c78bb7a4bdeef3d38eea8276a Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.644034 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" event={"ID":"eec8cee6-e26a-4b5c-b171-121a3f8b0547","Type":"ContainerStarted","Data":"65d20c9b612a438d441caf4dd99c6341bcce001c78bb7a4bdeef3d38eea8276a"} Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.646272 4708 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" event={"ID":"9719359f-a579-4083-b292-f0ed2a00dbb4","Type":"ContainerDied","Data":"133942e6fdfb2dd120fbe40eabdc867d241d149bb7b1c445b2b2e1fd01831544"} Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.646337 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.646365 4708 scope.go:117] "RemoveContainer" containerID="7e31b48abc1a043b5a857f37a8913f2ff6472e3eb1736930ac330a8ed2311155" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.650570 4708 generic.go:334] "Generic (PLEG): container finished" podID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerID="c0893ce7d1b0d3399d4c15fbd8d9b1b0c2d4a5713addd02e7cc06144794961d3" exitCode=0 Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.650644 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr9ct" event={"ID":"18d7f096-fc87-4f68-959e-5ab803b7e097","Type":"ContainerDied","Data":"c0893ce7d1b0d3399d4c15fbd8d9b1b0c2d4a5713addd02e7cc06144794961d3"} Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.653193 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" event={"ID":"140fa064-5588-4720-b6d8-2caf85fdf0ee","Type":"ContainerDied","Data":"8902d3c68998b275d8a10e8895e4491099eaef8af634d9de1b42cdd504048ebe"} Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.653266 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bc8477c76-2cm2t" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.667058 4708 scope.go:117] "RemoveContainer" containerID="8b92e066605900d7bbe2d61ad2622e58079076e0dee57c2f0fba50a1d28e3a8d" Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.704623 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d"] Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.710032 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b95f64d8-zr55d"] Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.715050 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bc8477c76-2cm2t"] Mar 20 16:05:43 crc kubenswrapper[4708]: I0320 16:05:43.717549 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bc8477c76-2cm2t"] Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.121904 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="140fa064-5588-4720-b6d8-2caf85fdf0ee" path="/var/lib/kubelet/pods/140fa064-5588-4720-b6d8-2caf85fdf0ee/volumes" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.123001 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9719359f-a579-4083-b292-f0ed2a00dbb4" path="/var/lib/kubelet/pods/9719359f-a579-4083-b292-f0ed2a00dbb4/volumes" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.322735 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.446367 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-catalog-content\") pod \"18d7f096-fc87-4f68-959e-5ab803b7e097\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.446420 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvqzm\" (UniqueName: \"kubernetes.io/projected/18d7f096-fc87-4f68-959e-5ab803b7e097-kube-api-access-xvqzm\") pod \"18d7f096-fc87-4f68-959e-5ab803b7e097\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.446453 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-utilities\") pod \"18d7f096-fc87-4f68-959e-5ab803b7e097\" (UID: \"18d7f096-fc87-4f68-959e-5ab803b7e097\") " Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.447351 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-utilities" (OuterVolumeSpecName: "utilities") pod "18d7f096-fc87-4f68-959e-5ab803b7e097" (UID: "18d7f096-fc87-4f68-959e-5ab803b7e097"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.453561 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d7f096-fc87-4f68-959e-5ab803b7e097-kube-api-access-xvqzm" (OuterVolumeSpecName: "kube-api-access-xvqzm") pod "18d7f096-fc87-4f68-959e-5ab803b7e097" (UID: "18d7f096-fc87-4f68-959e-5ab803b7e097"). InnerVolumeSpecName "kube-api-access-xvqzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.548519 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvqzm\" (UniqueName: \"kubernetes.io/projected/18d7f096-fc87-4f68-959e-5ab803b7e097-kube-api-access-xvqzm\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.549119 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.571435 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18d7f096-fc87-4f68-959e-5ab803b7e097" (UID: "18d7f096-fc87-4f68-959e-5ab803b7e097"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.650916 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18d7f096-fc87-4f68-959e-5ab803b7e097-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.662053 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" event={"ID":"eec8cee6-e26a-4b5c-b171-121a3f8b0547","Type":"ContainerStarted","Data":"147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5"} Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.663028 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.671016 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-mr9ct" event={"ID":"18d7f096-fc87-4f68-959e-5ab803b7e097","Type":"ContainerDied","Data":"4bdeba6928915fff4946c25abc323c91a1898e0a01cef506e43aeebf92843cc0"} Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.671079 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mr9ct" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.671420 4708 scope.go:117] "RemoveContainer" containerID="c0893ce7d1b0d3399d4c15fbd8d9b1b0c2d4a5713addd02e7cc06144794961d3" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.678088 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.689242 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" podStartSLOduration=3.689219075 podStartE2EDuration="3.689219075s" podCreationTimestamp="2026-03-20 16:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:44.686410865 +0000 UTC m=+299.360747590" watchObservedRunningTime="2026-03-20 16:05:44.689219075 +0000 UTC m=+299.363555790" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.701835 4708 scope.go:117] "RemoveContainer" containerID="5e055af0e94e4ef34fff89b9b072c5576d23cc1c554f71a4cadadb9eaaf946e6" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.730044 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mr9ct"] Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.741631 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mr9ct"] Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.744460 4708 
scope.go:117] "RemoveContainer" containerID="2faf8b9350c6b87e2af4b910a38a655377f744285ddc5e834e918f65f0bdff72" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.978622 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75d8c46c45-qtxrd"] Mar 20 16:05:44 crc kubenswrapper[4708]: E0320 16:05:44.978908 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerName="registry-server" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.978927 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerName="registry-server" Mar 20 16:05:44 crc kubenswrapper[4708]: E0320 16:05:44.978951 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerName="extract-content" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.978960 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerName="extract-content" Mar 20 16:05:44 crc kubenswrapper[4708]: E0320 16:05:44.978974 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140fa064-5588-4720-b6d8-2caf85fdf0ee" containerName="controller-manager" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.978982 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="140fa064-5588-4720-b6d8-2caf85fdf0ee" containerName="controller-manager" Mar 20 16:05:44 crc kubenswrapper[4708]: E0320 16:05:44.978995 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerName="extract-utilities" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.979002 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerName="extract-utilities" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.979097 4708 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" containerName="registry-server" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.979116 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="140fa064-5588-4720-b6d8-2caf85fdf0ee" containerName="controller-manager" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.979534 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.982210 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.982429 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.983350 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.983539 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.983815 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.983971 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.992115 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75d8c46c45-qtxrd"] Mar 20 16:05:44 crc kubenswrapper[4708]: I0320 16:05:44.992330 4708 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.056718 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-client-ca\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.056793 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-proxy-ca-bundles\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.057069 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-serving-cert\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.057355 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4qkp\" (UniqueName: \"kubernetes.io/projected/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-kube-api-access-l4qkp\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.057451 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-config\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.159356 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4qkp\" (UniqueName: \"kubernetes.io/projected/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-kube-api-access-l4qkp\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.159442 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-config\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.159477 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-client-ca\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.159523 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-proxy-ca-bundles\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.159601 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-serving-cert\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.160821 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-client-ca\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.160836 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-proxy-ca-bundles\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.161020 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-config\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.165702 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-serving-cert\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc 
kubenswrapper[4708]: I0320 16:05:45.177027 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4qkp\" (UniqueName: \"kubernetes.io/projected/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-kube-api-access-l4qkp\") pod \"controller-manager-75d8c46c45-qtxrd\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.295876 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.549213 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75d8c46c45-qtxrd"] Mar 20 16:05:45 crc kubenswrapper[4708]: W0320 16:05:45.555705 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8485bd73_9fba_4cbd_8438_c43bd02d4dfa.slice/crio-a29f50e989980df66de3cfe87558ee70cf0d510d03e6394088966b3879272bf7 WatchSource:0}: Error finding container a29f50e989980df66de3cfe87558ee70cf0d510d03e6394088966b3879272bf7: Status 404 returned error can't find the container with id a29f50e989980df66de3cfe87558ee70cf0d510d03e6394088966b3879272bf7 Mar 20 16:05:45 crc kubenswrapper[4708]: I0320 16:05:45.686497 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" event={"ID":"8485bd73-9fba-4cbd-8438-c43bd02d4dfa","Type":"ContainerStarted","Data":"a29f50e989980df66de3cfe87558ee70cf0d510d03e6394088966b3879272bf7"} Mar 20 16:05:46 crc kubenswrapper[4708]: I0320 16:05:46.120335 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d7f096-fc87-4f68-959e-5ab803b7e097" path="/var/lib/kubelet/pods/18d7f096-fc87-4f68-959e-5ab803b7e097/volumes" Mar 20 16:05:46 crc kubenswrapper[4708]: I0320 
16:05:46.695154 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" event={"ID":"8485bd73-9fba-4cbd-8438-c43bd02d4dfa","Type":"ContainerStarted","Data":"8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9"} Mar 20 16:05:46 crc kubenswrapper[4708]: I0320 16:05:46.695627 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:46 crc kubenswrapper[4708]: I0320 16:05:46.702875 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:05:46 crc kubenswrapper[4708]: I0320 16:05:46.741423 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" podStartSLOduration=5.741396054 podStartE2EDuration="5.741396054s" podCreationTimestamp="2026-03-20 16:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:05:46.717420066 +0000 UTC m=+301.391756791" watchObservedRunningTime="2026-03-20 16:05:46.741396054 +0000 UTC m=+301.415732779" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.949012 4708 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.952064 4708 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.952231 4708 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.952233 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.952472 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171" gracePeriod=15 Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.952548 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395" gracePeriod=15 Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.952564 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789" gracePeriod=15 Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.952651 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa" gracePeriod=15 Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.952642 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c" gracePeriod=15 Mar 20 16:05:48 crc 
kubenswrapper[4708]: E0320 16:05:48.953025 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953051 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: E0320 16:05:48.953064 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953071 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: E0320 16:05:48.953082 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953089 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: E0320 16:05:48.953096 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953105 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 16:05:48 crc kubenswrapper[4708]: E0320 16:05:48.953227 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953239 4708 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: E0320 16:05:48.953249 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953257 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 16:05:48 crc kubenswrapper[4708]: E0320 16:05:48.953268 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953274 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 16:05:48 crc kubenswrapper[4708]: E0320 16:05:48.953285 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953294 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 16:05:48 crc kubenswrapper[4708]: E0320 16:05:48.953314 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953323 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953444 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953455 4708 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953467 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953476 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953486 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953496 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953509 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 16:05:48 crc kubenswrapper[4708]: E0320 16:05:48.953650 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953662 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.953890 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.954102 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.958538 4708 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 20 16:05:48 crc kubenswrapper[4708]: I0320 16:05:48.994479 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.042756 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.042819 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.042878 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.042931 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" 
(UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.042981 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.043015 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.043039 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.043063 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.144526 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145102 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145144 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145180 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145227 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145242 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145277 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145337 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145365 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145410 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145442 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145462 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145495 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145531 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.144727 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.145641 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.294808 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:05:49 crc kubenswrapper[4708]: W0320 16:05:49.315759 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-a727c956ae0666e0cb318d4cc9e52bb1bf1c1e1fd11989f536dcef483b71e7fd WatchSource:0}: Error finding container a727c956ae0666e0cb318d4cc9e52bb1bf1c1e1fd11989f536dcef483b71e7fd: Status 404 returned error can't find the container with id a727c956ae0666e0cb318d4cc9e52bb1bf1c1e1fd11989f536dcef483b71e7fd Mar 20 16:05:49 crc kubenswrapper[4708]: E0320 16:05:49.319344 4708 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.4:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e984a9496a406 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:05:49.318456326 +0000 UTC m=+303.992793051,LastTimestamp:2026-03-20 16:05:49.318456326 +0000 UTC m=+303.992793051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" 
Mar 20 16:05:49 crc kubenswrapper[4708]: E0320 16:05:49.693816 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:05:49Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:05:49Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:05:49Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:05:49Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:ce2b0bfbec08802afec185b6eebec59a5a016291cad2c3515b0d06af6c34fde3\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:d5656ec3f5691d96c2da35350810e5bd700559213851b28fc3523a059efce76f\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes
\\\":1252734685},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:422eef49f9b56aaa481c870199db8b853dba5d36f00adcc19d22a6960345f1cc\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:667c2f632cea73b8b5354e1fbd365169f285c9b8460c5e81f63967a72b8f90e8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223676630},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"nam
es\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280
f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:49 crc kubenswrapper[4708]: E0320 16:05:49.694607 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:49 crc kubenswrapper[4708]: E0320 16:05:49.694863 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:49 crc kubenswrapper[4708]: E0320 16:05:49.695123 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:49 crc kubenswrapper[4708]: E0320 16:05:49.695319 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:49 crc 
kubenswrapper[4708]: E0320 16:05:49.695414 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.713171 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898"} Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.713219 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a727c956ae0666e0cb318d4cc9e52bb1bf1c1e1fd11989f536dcef483b71e7fd"} Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.714508 4708 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.719908 4708 generic.go:334] "Generic (PLEG): container finished" podID="ebff241e-ba92-403f-991f-82c78be60f16" containerID="819ce2cd55ffebd0c67c059a21d116a16fa4b0853e0e2fbc5d1d0eacb6c0b208" exitCode=0 Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.720000 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ebff241e-ba92-403f-991f-82c78be60f16","Type":"ContainerDied","Data":"819ce2cd55ffebd0c67c059a21d116a16fa4b0853e0e2fbc5d1d0eacb6c0b208"} Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.720612 4708 status_manager.go:851] "Failed to get status for pod" podUID="ebff241e-ba92-403f-991f-82c78be60f16" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.720808 4708 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.723595 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.725294 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.726543 4708 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395" exitCode=0 Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.726565 4708 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789" exitCode=0 Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.726574 4708 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa" exitCode=0 Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.726584 4708 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c" exitCode=2 Mar 20 16:05:49 crc kubenswrapper[4708]: I0320 16:05:49.726622 4708 scope.go:117] "RemoveContainer" containerID="8c7a6d46cdd42906aa14e0c45967985c5775a8af156926b7a66cf48d2d3ea670" Mar 20 16:05:50 crc kubenswrapper[4708]: I0320 16:05:50.745062 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.315395 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.316444 4708 status_manager.go:851] "Failed to get status for pod" podUID="ebff241e-ba92-403f-991f-82c78be60f16" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.316957 4708 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.323565 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.324946 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.325362 4708 status_manager.go:851] "Failed to get status for pod" podUID="ebff241e-ba92-403f-991f-82c78be60f16" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.325537 4708 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.325802 4708 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.479954 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480012 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-kubelet-dir\") pod \"ebff241e-ba92-403f-991f-82c78be60f16\" (UID: \"ebff241e-ba92-403f-991f-82c78be60f16\") " Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 
16:05:51.480080 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebff241e-ba92-403f-991f-82c78be60f16-kube-api-access\") pod \"ebff241e-ba92-403f-991f-82c78be60f16\" (UID: \"ebff241e-ba92-403f-991f-82c78be60f16\") " Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480112 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-var-lock\") pod \"ebff241e-ba92-403f-991f-82c78be60f16\" (UID: \"ebff241e-ba92-403f-991f-82c78be60f16\") " Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480091 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480157 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480130 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480197 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-var-lock" (OuterVolumeSpecName: "var-lock") pod "ebff241e-ba92-403f-991f-82c78be60f16" (UID: "ebff241e-ba92-403f-991f-82c78be60f16"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480237 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ebff241e-ba92-403f-991f-82c78be60f16" (UID: "ebff241e-ba92-403f-991f-82c78be60f16"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480240 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480271 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480550 4708 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480574 4708 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480582 4708 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480591 4708 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.480599 4708 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ebff241e-ba92-403f-991f-82c78be60f16-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.489478 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebff241e-ba92-403f-991f-82c78be60f16-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ebff241e-ba92-403f-991f-82c78be60f16" (UID: "ebff241e-ba92-403f-991f-82c78be60f16"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.583034 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebff241e-ba92-403f-991f-82c78be60f16-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.754064 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ebff241e-ba92-403f-991f-82c78be60f16","Type":"ContainerDied","Data":"06c7b359940d530d1b2c71c3de3dc0dcd2cb17da503f885a1606c85fa4bf12b2"} Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.754108 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06c7b359940d530d1b2c71c3de3dc0dcd2cb17da503f885a1606c85fa4bf12b2" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.754114 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.757905 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.758931 4708 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171" exitCode=0 Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.759046 4708 scope.go:117] "RemoveContainer" containerID="4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.759138 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.765891 4708 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.766479 4708 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.767062 4708 status_manager.go:851] "Failed to get status for pod" podUID="ebff241e-ba92-403f-991f-82c78be60f16" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.777621 4708 scope.go:117] "RemoveContainer" containerID="5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.785473 4708 status_manager.go:851] "Failed to get status for pod" podUID="ebff241e-ba92-403f-991f-82c78be60f16" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.786115 4708 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.786697 4708 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.794217 4708 scope.go:117] "RemoveContainer" containerID="b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.810481 4708 scope.go:117] "RemoveContainer" containerID="c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.828444 4708 scope.go:117] "RemoveContainer" containerID="708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.845371 4708 scope.go:117] "RemoveContainer" containerID="c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.862272 4708 scope.go:117] "RemoveContainer" containerID="4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395" Mar 20 16:05:51 crc kubenswrapper[4708]: E0320 16:05:51.862779 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\": container with ID starting with 4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395 not found: ID does not exist" containerID="4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395" Mar 
20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.862839 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395"} err="failed to get container status \"4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\": rpc error: code = NotFound desc = could not find container \"4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395\": container with ID starting with 4a1516e84cef52ef6069a4fc09dd9e3cfe8ce0d132b7351fcde63337e3aa9395 not found: ID does not exist" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.862886 4708 scope.go:117] "RemoveContainer" containerID="5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789" Mar 20 16:05:51 crc kubenswrapper[4708]: E0320 16:05:51.863301 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\": container with ID starting with 5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789 not found: ID does not exist" containerID="5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.863338 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789"} err="failed to get container status \"5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\": rpc error: code = NotFound desc = could not find container \"5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789\": container with ID starting with 5540c3ca63f78373f5d500202b3df27b9cfad2d217f5b56bdcf28f3bf69c9789 not found: ID does not exist" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.863369 4708 scope.go:117] "RemoveContainer" 
containerID="b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa" Mar 20 16:05:51 crc kubenswrapper[4708]: E0320 16:05:51.863868 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\": container with ID starting with b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa not found: ID does not exist" containerID="b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.863925 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa"} err="failed to get container status \"b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\": rpc error: code = NotFound desc = could not find container \"b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa\": container with ID starting with b68376e4a61dc6f91b5b45c7d0c62702450278373dfc8ff02c2812ac158967fa not found: ID does not exist" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.863953 4708 scope.go:117] "RemoveContainer" containerID="c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c" Mar 20 16:05:51 crc kubenswrapper[4708]: E0320 16:05:51.864295 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\": container with ID starting with c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c not found: ID does not exist" containerID="c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.864320 4708 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c"} err="failed to get container status \"c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\": rpc error: code = NotFound desc = could not find container \"c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c\": container with ID starting with c5828ae2d0608563c379e8cbdaf69dc7b69da7dd0c42cdee6e1419298f7bc54c not found: ID does not exist" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.864336 4708 scope.go:117] "RemoveContainer" containerID="708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171" Mar 20 16:05:51 crc kubenswrapper[4708]: E0320 16:05:51.864754 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\": container with ID starting with 708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171 not found: ID does not exist" containerID="708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.864805 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171"} err="failed to get container status \"708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\": rpc error: code = NotFound desc = could not find container \"708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171\": container with ID starting with 708c635ad8a836efbbca9e3aa78f19784ae48d066dc1731097fe9bd517fcf171 not found: ID does not exist" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.864828 4708 scope.go:117] "RemoveContainer" containerID="c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5" Mar 20 16:05:51 crc kubenswrapper[4708]: E0320 16:05:51.865610 4708 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\": container with ID starting with c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5 not found: ID does not exist" containerID="c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5" Mar 20 16:05:51 crc kubenswrapper[4708]: I0320 16:05:51.865696 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5"} err="failed to get container status \"c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\": rpc error: code = NotFound desc = could not find container \"c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5\": container with ID starting with c0ad5750202943db395950d0aa188660646ddbfe0b9f238252773ca218dfdfe5 not found: ID does not exist" Mar 20 16:05:52 crc kubenswrapper[4708]: I0320 16:05:52.122341 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 16:05:55 crc kubenswrapper[4708]: E0320 16:05:55.397837 4708 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.4:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e984a9496a406 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 16:05:49.318456326 +0000 UTC m=+303.992793051,LastTimestamp:2026-03-20 16:05:49.318456326 +0000 UTC m=+303.992793051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 16:05:56 crc kubenswrapper[4708]: I0320 16:05:56.117695 4708 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:56 crc kubenswrapper[4708]: I0320 16:05:56.118445 4708 status_manager.go:851] "Failed to get status for pod" podUID="ebff241e-ba92-403f-991f-82c78be60f16" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:58 crc kubenswrapper[4708]: E0320 16:05:58.205224 4708 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:58 crc kubenswrapper[4708]: E0320 16:05:58.206309 4708 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:58 crc kubenswrapper[4708]: E0320 16:05:58.206906 4708 controller.go:195] "Failed to update lease" 
err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:58 crc kubenswrapper[4708]: E0320 16:05:58.207360 4708 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:58 crc kubenswrapper[4708]: E0320 16:05:58.207837 4708 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:58 crc kubenswrapper[4708]: I0320 16:05:58.207884 4708 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 16:05:58 crc kubenswrapper[4708]: E0320 16:05:58.208337 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" interval="200ms" Mar 20 16:05:58 crc kubenswrapper[4708]: E0320 16:05:58.408992 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" interval="400ms" Mar 20 16:05:58 crc kubenswrapper[4708]: E0320 16:05:58.809857 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" interval="800ms" Mar 20 16:05:59 crc kubenswrapper[4708]: E0320 
16:05:59.611025 4708 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" interval="1.6s" Mar 20 16:05:59 crc kubenswrapper[4708]: E0320 16:05:59.823931 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:05:59Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:05:59Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:05:59Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T16:05:59Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha25
6:ce2b0bfbec08802afec185b6eebec59a5a016291cad2c3515b0d06af6c34fde3\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:d5656ec3f5691d96c2da35350810e5bd700559213851b28fc3523a059efce76f\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252734685},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:422eef49f9b56aaa481c870199db8b853dba5d36f00adcc19d22a6960345f1cc\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:667c2f632cea73b8b5354e1fbd365169f285c9b8460c5e81f63967a72b8f90e8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223676630},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"nam
es\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e778
94e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520
ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:59 crc kubenswrapper[4708]: E0320 16:05:59.824482 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:59 crc kubenswrapper[4708]: E0320 16:05:59.824901 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:59 crc kubenswrapper[4708]: E0320 16:05:59.825295 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:59 crc kubenswrapper[4708]: 
E0320 16:05:59.825601 4708 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:05:59 crc kubenswrapper[4708]: E0320 16:05:59.825617 4708 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.110598 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.112731 4708 status_manager.go:851] "Failed to get status for pod" podUID="ebff241e-ba92-403f-991f-82c78be60f16" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.113557 4708 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.139810 4708 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46f2e587-1a2b-476f-aaf1-a95fec8e0434" Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.139848 4708 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46f2e587-1a2b-476f-aaf1-a95fec8e0434" Mar 20 16:06:00 crc kubenswrapper[4708]: E0320 16:06:00.140352 4708 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.141090 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.824943 4708 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="a417a4fa42d5e53f2cf52b84b0259553ab5c06eab718e54e0fb4e3941bfbb8ef" exitCode=0 Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.825028 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"a417a4fa42d5e53f2cf52b84b0259553ab5c06eab718e54e0fb4e3941bfbb8ef"} Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.825290 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"745a62a28cc866f23d653c7e3df7690e3af6c91e2c3350c2a43d6243f7f786d6"} Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.825611 4708 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46f2e587-1a2b-476f-aaf1-a95fec8e0434" Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.825629 4708 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46f2e587-1a2b-476f-aaf1-a95fec8e0434" Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.825867 4708 status_manager.go:851] "Failed to get status for pod" podUID="ebff241e-ba92-403f-991f-82c78be60f16" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:06:00 crc kubenswrapper[4708]: I0320 16:06:00.826033 4708 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" Mar 20 16:06:00 crc kubenswrapper[4708]: E0320 16:06:00.826529 4708 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.4:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:01 crc kubenswrapper[4708]: I0320 16:06:01.840825 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3841c6de0771ce4f626b1c46f808b0a224cb0472d38f1e91880bdff7003ca66f"} Mar 20 16:06:01 crc kubenswrapper[4708]: I0320 16:06:01.841116 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1cc002a37a2ab5c7805b11b71abcba540abfbb5807e06d3d65af64e16dc8107d"} Mar 20 16:06:01 crc kubenswrapper[4708]: I0320 16:06:01.841126 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"db8b605f35c4e065f5c8192e35d5e80a28132acbe4c93b9440b65dafed3741d2"} Mar 20 16:06:01 crc kubenswrapper[4708]: I0320 16:06:01.841135 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a0fbebb79a896a9c86add7f6effe80d87f748d1a44f3d61109d99c9b6b1ebeae"} Mar 20 16:06:02 crc kubenswrapper[4708]: I0320 16:06:02.848885 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e1583b0c6eee6d13fff02a26372a70c54f49c63f4464186d74144c59251e5a8b"} Mar 20 16:06:02 crc kubenswrapper[4708]: I0320 16:06:02.849269 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:02 crc kubenswrapper[4708]: I0320 16:06:02.849080 4708 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46f2e587-1a2b-476f-aaf1-a95fec8e0434" Mar 20 16:06:02 crc kubenswrapper[4708]: I0320 16:06:02.849331 4708 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46f2e587-1a2b-476f-aaf1-a95fec8e0434" Mar 20 16:06:02 crc kubenswrapper[4708]: I0320 16:06:02.852684 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 16:06:02 crc kubenswrapper[4708]: I0320 16:06:02.853471 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 16:06:02 crc kubenswrapper[4708]: I0320 16:06:02.853524 4708 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6" exitCode=1 Mar 20 16:06:02 crc kubenswrapper[4708]: I0320 16:06:02.853556 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6"} Mar 20 16:06:02 crc kubenswrapper[4708]: I0320 16:06:02.853906 4708 scope.go:117] "RemoveContainer" containerID="757d3bf1a5e15426f57f88013140f24aeed64ae22555f29107d68c73d05262f6" Mar 20 16:06:03 crc kubenswrapper[4708]: I0320 16:06:03.875875 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 16:06:03 crc kubenswrapper[4708]: I0320 16:06:03.876791 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 16:06:03 crc kubenswrapper[4708]: I0320 16:06:03.876878 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"982bb4266b1f9049ebf187caaa9de615c62b602e876a2484190c4075e4e4213d"} Mar 20 16:06:05 crc kubenswrapper[4708]: I0320 16:06:05.141453 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:05 crc kubenswrapper[4708]: I0320 16:06:05.141968 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:05 crc kubenswrapper[4708]: I0320 16:06:05.151222 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:07 crc kubenswrapper[4708]: I0320 16:06:07.863628 4708 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:07 crc 
kubenswrapper[4708]: I0320 16:06:07.904634 4708 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46f2e587-1a2b-476f-aaf1-a95fec8e0434"
Mar 20 16:06:07 crc kubenswrapper[4708]: I0320 16:06:07.904697 4708 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46f2e587-1a2b-476f-aaf1-a95fec8e0434"
Mar 20 16:06:07 crc kubenswrapper[4708]: I0320 16:06:07.915247 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 16:06:07 crc kubenswrapper[4708]: I0320 16:06:07.933455 4708 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f48e97a3-287a-4225-871b-45d0904bdc35"
Mar 20 16:06:08 crc kubenswrapper[4708]: I0320 16:06:08.909226 4708 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46f2e587-1a2b-476f-aaf1-a95fec8e0434"
Mar 20 16:06:08 crc kubenswrapper[4708]: I0320 16:06:08.909257 4708 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46f2e587-1a2b-476f-aaf1-a95fec8e0434"
Mar 20 16:06:08 crc kubenswrapper[4708]: I0320 16:06:08.912136 4708 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f48e97a3-287a-4225-871b-45d0904bdc35"
Mar 20 16:06:11 crc kubenswrapper[4708]: I0320 16:06:11.469226 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:06:11 crc kubenswrapper[4708]: I0320 16:06:11.475077 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:06:11 crc kubenswrapper[4708]: I0320 16:06:11.585137 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.233357 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.233735 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.233775 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.233802 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.236238 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.236442 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.236463 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.245576 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.246081 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.253117 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.260358 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.260563 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.520587 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.545959 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.562030 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:06:16 crc kubenswrapper[4708]: I0320 16:06:16.859212 4708 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 16:06:16 crc kubenswrapper[4708]: W0320 16:06:16.975658 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-6401376416a86106ee0f323e3312feb4e211aaa131b37db7a02ffb6c9c3084e3 WatchSource:0}: Error finding container 6401376416a86106ee0f323e3312feb4e211aaa131b37db7a02ffb6c9c3084e3: Status 404 returned error can't find the container with id 6401376416a86106ee0f323e3312feb4e211aaa131b37db7a02ffb6c9c3084e3
Mar 20 16:06:17 crc kubenswrapper[4708]: I0320 16:06:17.034072 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 16:06:17 crc kubenswrapper[4708]: I0320 16:06:17.659510 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 16:06:17 crc kubenswrapper[4708]: I0320 16:06:17.774600 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 16:06:17 crc kubenswrapper[4708]: I0320 16:06:17.915235 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 16:06:17 crc kubenswrapper[4708]: I0320 16:06:17.981497 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ceb9d747801fe98bb422f15a855063e54852827b3bcb0cdc5dd9fe6472229aad"}
Mar 20 16:06:17 crc kubenswrapper[4708]: I0320 16:06:17.981566 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"20a5996f5b4007abf2dd2ef0babdf2b88da03fab822611d372dd5edfa0cd3e5e"}
Mar 20 16:06:17 crc kubenswrapper[4708]: I0320 16:06:17.983319 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"625e4ad7f4d948fd438b1c8479b8f6d2606e4bc46a5764b70f1d624459b6b7e6"}
Mar 20 16:06:17 crc kubenswrapper[4708]: I0320 16:06:17.983352 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6401376416a86106ee0f323e3312feb4e211aaa131b37db7a02ffb6c9c3084e3"}
Mar 20 16:06:17 crc kubenswrapper[4708]: I0320 16:06:17.988401 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3ab70875dfe265219a27a50949bac61be8f991239877c708e8169c8ddf768734"}
Mar 20 16:06:17 crc kubenswrapper[4708]: I0320 16:06:17.988444 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"06bbb07e368ca64d1725c1582e4c91085df1766b24821b3a3037627a513eef28"}
Mar 20 16:06:18 crc kubenswrapper[4708]: I0320 16:06:18.037335 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 16:06:18 crc kubenswrapper[4708]: I0320 16:06:18.185026 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 16:06:18 crc kubenswrapper[4708]: I0320 16:06:18.542416 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 16:06:18 crc kubenswrapper[4708]: I0320 16:06:18.982036 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 16:06:18 crc kubenswrapper[4708]: I0320 16:06:18.995814 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Mar 20 16:06:18 crc kubenswrapper[4708]: I0320 16:06:18.995866 4708 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="625e4ad7f4d948fd438b1c8479b8f6d2606e4bc46a5764b70f1d624459b6b7e6" exitCode=255
Mar 20 16:06:18 crc kubenswrapper[4708]: I0320 16:06:18.995914 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"625e4ad7f4d948fd438b1c8479b8f6d2606e4bc46a5764b70f1d624459b6b7e6"}
Mar 20 16:06:18 crc kubenswrapper[4708]: I0320 16:06:18.996251 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 16:06:18 crc kubenswrapper[4708]: I0320 16:06:18.996652 4708 scope.go:117] "RemoveContainer" containerID="625e4ad7f4d948fd438b1c8479b8f6d2606e4bc46a5764b70f1d624459b6b7e6"
Mar 20 16:06:19 crc kubenswrapper[4708]: I0320 16:06:19.040405 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 20 16:06:19 crc kubenswrapper[4708]: I0320 16:06:19.075084 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 20 16:06:19 crc kubenswrapper[4708]: I0320 16:06:19.331766 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 16:06:19 crc kubenswrapper[4708]: I0320 16:06:19.364119 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 20 16:06:19 crc kubenswrapper[4708]: I0320 16:06:19.551857 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 20 16:06:20 crc kubenswrapper[4708]: I0320 16:06:20.014894 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 16:06:20 crc kubenswrapper[4708]: I0320 16:06:20.020426 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 20 16:06:20 crc kubenswrapper[4708]: I0320 16:06:20.254843 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 16:06:20 crc kubenswrapper[4708]: I0320 16:06:20.924599 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 16:06:21 crc kubenswrapper[4708]: I0320 16:06:21.048516 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Mar 20 16:06:21 crc kubenswrapper[4708]: I0320 16:06:21.048582 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a92d2456ade7ea31cf1cf6104863960aa9256591c88fd9af7825f0aadac1868a"}
Mar 20 16:06:21 crc kubenswrapper[4708]: I0320 16:06:21.072717 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 16:06:21 crc kubenswrapper[4708]: I0320 16:06:21.177722 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 20 16:06:21 crc kubenswrapper[4708]: I0320 16:06:21.511199 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 20 16:06:21 crc kubenswrapper[4708]: I0320 16:06:21.590645 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 16:06:21 crc kubenswrapper[4708]: I0320 16:06:21.742111 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 20 16:06:21 crc kubenswrapper[4708]: I0320 16:06:21.788806 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 20 16:06:21 crc kubenswrapper[4708]: I0320 16:06:21.892867 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.057110 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.057556 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.057600 4708 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="a92d2456ade7ea31cf1cf6104863960aa9256591c88fd9af7825f0aadac1868a" exitCode=255
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.057635 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"a92d2456ade7ea31cf1cf6104863960aa9256591c88fd9af7825f0aadac1868a"}
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.057701 4708 scope.go:117] "RemoveContainer" containerID="625e4ad7f4d948fd438b1c8479b8f6d2606e4bc46a5764b70f1d624459b6b7e6"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.059080 4708 scope.go:117] "RemoveContainer" containerID="a92d2456ade7ea31cf1cf6104863960aa9256591c88fd9af7825f0aadac1868a"
Mar 20 16:06:22 crc kubenswrapper[4708]: E0320 16:06:22.059373 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.071179 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.128740 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.181971 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.282646 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.577809 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.582121 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.594507 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.639852 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.832806 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.846878 4708 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 20 16:06:22 crc kubenswrapper[4708]: I0320 16:06:22.870100 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.007086 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.009905 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.063947 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.103839 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.195171 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.252054 4708 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.282277 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.368801 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.387385 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.395565 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.433973 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.442653 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.575418 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.602097 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.612829 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.634431 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.876111 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.909829 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.937496 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.989711 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 20 16:06:23 crc kubenswrapper[4708]: I0320 16:06:23.991016 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.001431 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.061963 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.102117 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.111774 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.213431 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.224812 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.243241 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.255542 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.267320 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.418896 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.451430 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.451722 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.453485 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.459200 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.472296 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.485132 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.819792 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.891989 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.904407 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 16:06:24 crc kubenswrapper[4708]: I0320 16:06:24.955480 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.075273 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.093600 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.196001 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.202976 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.256519 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.325524 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.334281 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.444536 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.487453 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.604060 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.627279 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.785315 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.801832 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.813538 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.822118 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.846507 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.870073 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 20 16:06:25 crc kubenswrapper[4708]: I0320 16:06:25.903033 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.052008 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.102087 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.124907 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.151197 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.328980 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.404450 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.427933 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.461236 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.522718 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.560157 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.562465 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.705447 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.705615 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.725120 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 20 16:06:26 crc kubenswrapper[4708]: I0320 16:06:26.755451 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.173899 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.372553 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.444851 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.531043 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.541610 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.556136 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.617131 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.628727 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.629642 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.635728 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.670996 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.716480 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.738318 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.741989 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.769917 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.880028 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.902874 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 16:06:27 crc kubenswrapper[4708]: I0320 16:06:27.924847 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.003175 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.029556 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.105749 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.130869 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.203158 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.324871 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.383882 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.388145 4708 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.474983 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.492496 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.644200 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.708088 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.781569 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.831073 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.888405 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.928323 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 16:06:28 crc kubenswrapper[4708]: I0320 16:06:28.981646 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.117067 4708 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.163471 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.216614 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.289995 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.349503 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.359070 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.558081 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.573762 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.603688 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.638492 4708 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.664602 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.685757 4708 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.735881 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.864417 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.930747 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 16:06:29 crc kubenswrapper[4708]: I0320 16:06:29.959283 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.029883 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.074427 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.096397 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.107561 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.111824 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.140821 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 16:06:30 crc 
kubenswrapper[4708]: I0320 16:06:30.145075 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.146492 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.208616 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.230992 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.234931 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.264722 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.271073 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.359807 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.386648 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.589734 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.593619 4708 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.621749 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.637618 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.651086 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.652201 4708 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.653314 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=42.653270903 podStartE2EDuration="42.653270903s" podCreationTimestamp="2026-03-20 16:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:06:07.907757278 +0000 UTC m=+322.582094003" watchObservedRunningTime="2026-03-20 16:06:30.653270903 +0000 UTC m=+345.327607638" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.658527 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.658590 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-infra/auto-csr-approver-29567046-99wtr"] Mar 20 16:06:30 crc kubenswrapper[4708]: E0320 16:06:30.658821 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebff241e-ba92-403f-991f-82c78be60f16" containerName="installer" Mar 20 
16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.658835 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebff241e-ba92-403f-991f-82c78be60f16" containerName="installer" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.658965 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebff241e-ba92-403f-991f-82c78be60f16" containerName="installer" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.659533 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-99wtr" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.661791 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.663631 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.663681 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.663950 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.681295 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.682755 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.682735076 podStartE2EDuration="23.682735076s" podCreationTimestamp="2026-03-20 16:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:06:30.678021462 +0000 UTC 
m=+345.352358257" watchObservedRunningTime="2026-03-20 16:06:30.682735076 +0000 UTC m=+345.357071791" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.700351 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.770578 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.841637 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8n7d\" (UniqueName: \"kubernetes.io/projected/cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb-kube-api-access-f8n7d\") pod \"auto-csr-approver-29567046-99wtr\" (UID: \"cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb\") " pod="openshift-infra/auto-csr-approver-29567046-99wtr" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.854447 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.861720 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.943289 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8n7d\" (UniqueName: \"kubernetes.io/projected/cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb-kube-api-access-f8n7d\") pod \"auto-csr-approver-29567046-99wtr\" (UID: \"cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb\") " pod="openshift-infra/auto-csr-approver-29567046-99wtr" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.962770 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8n7d\" (UniqueName: 
\"kubernetes.io/projected/cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb-kube-api-access-f8n7d\") pod \"auto-csr-approver-29567046-99wtr\" (UID: \"cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb\") " pod="openshift-infra/auto-csr-approver-29567046-99wtr" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.984908 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 16:06:30 crc kubenswrapper[4708]: I0320 16:06:30.988890 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-99wtr" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.005622 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.013002 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.015005 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.179841 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.246232 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.255022 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.275232 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.278309 4708 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.389533 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-99wtr"] Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.422348 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.438346 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.453210 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.648473 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.673800 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.690520 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.721239 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.736245 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.793334 4708 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.841279 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.841565 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.885435 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 16:06:31 crc kubenswrapper[4708]: I0320 16:06:31.967955 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.024482 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.061212 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.124730 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-99wtr" event={"ID":"cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb","Type":"ContainerStarted","Data":"6ecb3e46e5e83fe07076c2e4eaf2022dd75fdc18a52f2d4402504928c9164fd2"} Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.214692 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.423624 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.440608 4708 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.458875 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.506663 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.508260 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.607839 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.695735 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.805702 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 16:06:32 crc kubenswrapper[4708]: I0320 16:06:32.904136 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 16:06:33 crc kubenswrapper[4708]: I0320 16:06:33.189685 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 16:06:33 crc kubenswrapper[4708]: I0320 16:06:33.310574 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 16:06:33 crc kubenswrapper[4708]: I0320 16:06:33.403956 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 16:06:33 
crc kubenswrapper[4708]: I0320 16:06:33.603310 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 16:06:33 crc kubenswrapper[4708]: I0320 16:06:33.604011 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 16:06:33 crc kubenswrapper[4708]: I0320 16:06:33.713124 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 16:06:33 crc kubenswrapper[4708]: I0320 16:06:33.935276 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 16:06:33 crc kubenswrapper[4708]: I0320 16:06:33.939721 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 16:06:33 crc kubenswrapper[4708]: I0320 16:06:33.959547 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 16:06:34 crc kubenswrapper[4708]: I0320 16:06:34.077553 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 16:06:34 crc kubenswrapper[4708]: I0320 16:06:34.111640 4708 scope.go:117] "RemoveContainer" containerID="a92d2456ade7ea31cf1cf6104863960aa9256591c88fd9af7825f0aadac1868a" Mar 20 16:06:34 crc kubenswrapper[4708]: I0320 16:06:34.137199 4708 generic.go:334] "Generic (PLEG): container finished" podID="cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb" containerID="43eda6186390c28f2015dd7ac4c88acf45551a4ec2f89997a3a9fdc781204fd4" exitCode=0 Mar 20 16:06:34 crc kubenswrapper[4708]: I0320 16:06:34.137274 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-99wtr" 
event={"ID":"cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb","Type":"ContainerDied","Data":"43eda6186390c28f2015dd7ac4c88acf45551a4ec2f89997a3a9fdc781204fd4"} Mar 20 16:06:34 crc kubenswrapper[4708]: I0320 16:06:34.285833 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 16:06:34 crc kubenswrapper[4708]: I0320 16:06:34.345292 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 16:06:34 crc kubenswrapper[4708]: I0320 16:06:34.487707 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 16:06:34 crc kubenswrapper[4708]: I0320 16:06:34.545792 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 16:06:34 crc kubenswrapper[4708]: I0320 16:06:34.714439 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.072078 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.072183 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.073329 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.147287 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.147600 4708 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"dc20b9c234220e77618a59d40944175d40510297c5dde79aee5d91a3bf8b0dba"} Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.215557 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.313221 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.418527 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-99wtr" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.466092 4708 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.497564 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.535591 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8n7d\" (UniqueName: \"kubernetes.io/projected/cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb-kube-api-access-f8n7d\") pod \"cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb\" (UID: \"cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb\") " Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.540406 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb-kube-api-access-f8n7d" (OuterVolumeSpecName: "kube-api-access-f8n7d") pod "cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb" (UID: "cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb"). InnerVolumeSpecName "kube-api-access-f8n7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.636917 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8n7d\" (UniqueName: \"kubernetes.io/projected/cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb-kube-api-access-f8n7d\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.794129 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 16:06:35 crc kubenswrapper[4708]: I0320 16:06:35.796228 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 16:06:36 crc kubenswrapper[4708]: I0320 16:06:36.155348 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-99wtr" event={"ID":"cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb","Type":"ContainerDied","Data":"6ecb3e46e5e83fe07076c2e4eaf2022dd75fdc18a52f2d4402504928c9164fd2"} Mar 20 16:06:36 crc kubenswrapper[4708]: I0320 16:06:36.155384 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-99wtr" Mar 20 16:06:36 crc kubenswrapper[4708]: I0320 16:06:36.155406 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ecb3e46e5e83fe07076c2e4eaf2022dd75fdc18a52f2d4402504928c9164fd2" Mar 20 16:06:36 crc kubenswrapper[4708]: I0320 16:06:36.222094 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 16:06:36 crc kubenswrapper[4708]: I0320 16:06:36.266484 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 16:06:37 crc kubenswrapper[4708]: I0320 16:06:37.073209 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 16:06:37 crc kubenswrapper[4708]: I0320 16:06:37.255506 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 16:06:37 crc kubenswrapper[4708]: I0320 16:06:37.946596 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 16:06:41 crc kubenswrapper[4708]: I0320 16:06:41.466623 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75d8c46c45-qtxrd"] Mar 20 16:06:41 crc kubenswrapper[4708]: I0320 16:06:41.467295 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" podUID="8485bd73-9fba-4cbd-8438-c43bd02d4dfa" containerName="controller-manager" containerID="cri-o://8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9" gracePeriod=30 Mar 20 16:06:41 crc kubenswrapper[4708]: I0320 16:06:41.577802 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v"] Mar 20 16:06:41 crc kubenswrapper[4708]: I0320 16:06:41.578074 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" podUID="eec8cee6-e26a-4b5c-b171-121a3f8b0547" containerName="route-controller-manager" containerID="cri-o://147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5" gracePeriod=30 Mar 20 16:06:41 crc kubenswrapper[4708]: I0320 16:06:41.666293 4708 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 16:06:41 crc kubenswrapper[4708]: I0320 16:06:41.666981 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898" gracePeriod=5 Mar 20 16:06:41 crc kubenswrapper[4708]: I0320 16:06:41.856148 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:06:41 crc kubenswrapper[4708]: I0320 16:06:41.927073 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.018314 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-serving-cert\") pod \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.018631 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-proxy-ca-bundles\") pod \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.018776 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-client-ca\") pod \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.018820 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4qkp\" (UniqueName: \"kubernetes.io/projected/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-kube-api-access-l4qkp\") pod \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.018885 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-client-ca\") pod \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.018962 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eec8cee6-e26a-4b5c-b171-121a3f8b0547-serving-cert\") pod \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.018995 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-config\") pod \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.019051 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg8gg\" (UniqueName: \"kubernetes.io/projected/eec8cee6-e26a-4b5c-b171-121a3f8b0547-kube-api-access-fg8gg\") pod \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\" (UID: \"eec8cee6-e26a-4b5c-b171-121a3f8b0547\") " Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.019070 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-config\") pod \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\" (UID: \"8485bd73-9fba-4cbd-8438-c43bd02d4dfa\") " Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.019585 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-client-ca" (OuterVolumeSpecName: "client-ca") pod "8485bd73-9fba-4cbd-8438-c43bd02d4dfa" (UID: "8485bd73-9fba-4cbd-8438-c43bd02d4dfa"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.020180 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8485bd73-9fba-4cbd-8438-c43bd02d4dfa" (UID: "8485bd73-9fba-4cbd-8438-c43bd02d4dfa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.020446 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-config" (OuterVolumeSpecName: "config") pod "eec8cee6-e26a-4b5c-b171-121a3f8b0547" (UID: "eec8cee6-e26a-4b5c-b171-121a3f8b0547"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.020347 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-config" (OuterVolumeSpecName: "config") pod "8485bd73-9fba-4cbd-8438-c43bd02d4dfa" (UID: "8485bd73-9fba-4cbd-8438-c43bd02d4dfa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.022196 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-client-ca" (OuterVolumeSpecName: "client-ca") pod "eec8cee6-e26a-4b5c-b171-121a3f8b0547" (UID: "eec8cee6-e26a-4b5c-b171-121a3f8b0547"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.024007 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eec8cee6-e26a-4b5c-b171-121a3f8b0547-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eec8cee6-e26a-4b5c-b171-121a3f8b0547" (UID: "eec8cee6-e26a-4b5c-b171-121a3f8b0547"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.024118 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8485bd73-9fba-4cbd-8438-c43bd02d4dfa" (UID: "8485bd73-9fba-4cbd-8438-c43bd02d4dfa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.024461 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eec8cee6-e26a-4b5c-b171-121a3f8b0547-kube-api-access-fg8gg" (OuterVolumeSpecName: "kube-api-access-fg8gg") pod "eec8cee6-e26a-4b5c-b171-121a3f8b0547" (UID: "eec8cee6-e26a-4b5c-b171-121a3f8b0547"). InnerVolumeSpecName "kube-api-access-fg8gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.024764 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-kube-api-access-l4qkp" (OuterVolumeSpecName: "kube-api-access-l4qkp") pod "8485bd73-9fba-4cbd-8438-c43bd02d4dfa" (UID: "8485bd73-9fba-4cbd-8438-c43bd02d4dfa"). InnerVolumeSpecName "kube-api-access-l4qkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.120073 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.120377 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.120391 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4qkp\" (UniqueName: \"kubernetes.io/projected/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-kube-api-access-l4qkp\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.120407 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.120420 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eec8cee6-e26a-4b5c-b171-121a3f8b0547-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.120432 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec8cee6-e26a-4b5c-b171-121a3f8b0547-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.120443 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg8gg\" (UniqueName: \"kubernetes.io/projected/eec8cee6-e26a-4b5c-b171-121a3f8b0547-kube-api-access-fg8gg\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.120455 4708 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.120465 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8485bd73-9fba-4cbd-8438-c43bd02d4dfa-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.196841 4708 generic.go:334] "Generic (PLEG): container finished" podID="8485bd73-9fba-4cbd-8438-c43bd02d4dfa" containerID="8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9" exitCode=0 Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.196948 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" event={"ID":"8485bd73-9fba-4cbd-8438-c43bd02d4dfa","Type":"ContainerDied","Data":"8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9"} Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.196990 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" event={"ID":"8485bd73-9fba-4cbd-8438-c43bd02d4dfa","Type":"ContainerDied","Data":"a29f50e989980df66de3cfe87558ee70cf0d510d03e6394088966b3879272bf7"} Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.197019 4708 scope.go:117] "RemoveContainer" containerID="8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.197208 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75d8c46c45-qtxrd" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.205245 4708 generic.go:334] "Generic (PLEG): container finished" podID="eec8cee6-e26a-4b5c-b171-121a3f8b0547" containerID="147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5" exitCode=0 Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.206311 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" event={"ID":"eec8cee6-e26a-4b5c-b171-121a3f8b0547","Type":"ContainerDied","Data":"147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5"} Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.206383 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" event={"ID":"eec8cee6-e26a-4b5c-b171-121a3f8b0547","Type":"ContainerDied","Data":"65d20c9b612a438d441caf4dd99c6341bcce001c78bb7a4bdeef3d38eea8276a"} Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.206476 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.219365 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75d8c46c45-qtxrd"] Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.219562 4708 scope.go:117] "RemoveContainer" containerID="8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9" Mar 20 16:06:42 crc kubenswrapper[4708]: E0320 16:06:42.220468 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9\": container with ID starting with 8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9 not found: ID does not exist" containerID="8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.220523 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9"} err="failed to get container status \"8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9\": rpc error: code = NotFound desc = could not find container \"8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9\": container with ID starting with 8224d054186370a84e7e275d8fcce98d5db018670fde569d319d6a6f82b8b7e9 not found: ID does not exist" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.220553 4708 scope.go:117] "RemoveContainer" containerID="147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.222168 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75d8c46c45-qtxrd"] Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.228449 4708 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v"] Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.231765 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f76f8945d-f6l7v"] Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.239084 4708 scope.go:117] "RemoveContainer" containerID="147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5" Mar 20 16:06:42 crc kubenswrapper[4708]: E0320 16:06:42.239540 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5\": container with ID starting with 147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5 not found: ID does not exist" containerID="147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5" Mar 20 16:06:42 crc kubenswrapper[4708]: I0320 16:06:42.239575 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5"} err="failed to get container status \"147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5\": rpc error: code = NotFound desc = could not find container \"147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5\": container with ID starting with 147d3b60aeba45ba10f0504056540d51d32b9e46c1a2fd9c9578696ec4c6abb5 not found: ID does not exist" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.027332 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x"] Mar 20 16:06:43 crc kubenswrapper[4708]: E0320 16:06:43.029077 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb" containerName="oc" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.029121 
4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb" containerName="oc" Mar 20 16:06:43 crc kubenswrapper[4708]: E0320 16:06:43.029180 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eec8cee6-e26a-4b5c-b171-121a3f8b0547" containerName="route-controller-manager" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.029195 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="eec8cee6-e26a-4b5c-b171-121a3f8b0547" containerName="route-controller-manager" Mar 20 16:06:43 crc kubenswrapper[4708]: E0320 16:06:43.029214 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.029223 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 16:06:43 crc kubenswrapper[4708]: E0320 16:06:43.029234 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8485bd73-9fba-4cbd-8438-c43bd02d4dfa" containerName="controller-manager" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.029242 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="8485bd73-9fba-4cbd-8438-c43bd02d4dfa" containerName="controller-manager" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.029477 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="eec8cee6-e26a-4b5c-b171-121a3f8b0547" containerName="route-controller-manager" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.029504 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="8485bd73-9fba-4cbd-8438-c43bd02d4dfa" containerName="controller-manager" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.029530 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 
16:06:43.029546 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb" containerName="oc" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.030142 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.033941 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.034732 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-client-ca\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.034759 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-serving-cert\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.034829 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-config\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.034852 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q87b\" (UniqueName: \"kubernetes.io/projected/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-kube-api-access-9q87b\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.037289 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.037499 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.037698 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.037838 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.037975 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.038392 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cbf8bb65-fl828"] Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.039902 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.042147 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.042276 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.042653 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.042662 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.042942 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.047085 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x"] Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.049489 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.053035 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cbf8bb65-fl828"] Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.058155 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.135477 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-client-ca\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.135517 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-serving-cert\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.135563 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-proxy-ca-bundles\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.135591 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-config\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.135611 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlxlv\" (UniqueName: \"kubernetes.io/projected/d6e9031d-0652-4bef-9e25-bafa28c521d2-kube-api-access-dlxlv\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " 
pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.135662 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e9031d-0652-4bef-9e25-bafa28c521d2-serving-cert\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.135902 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-config\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.135953 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q87b\" (UniqueName: \"kubernetes.io/projected/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-kube-api-access-9q87b\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.136056 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-client-ca\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.136562 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-client-ca\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.138332 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-config\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.139306 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-serving-cert\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.150177 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q87b\" (UniqueName: \"kubernetes.io/projected/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-kube-api-access-9q87b\") pod \"route-controller-manager-6c8ffbd4fb-jxd6x\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.236887 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-proxy-ca-bundles\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc 
kubenswrapper[4708]: I0320 16:06:43.236940 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-config\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.236966 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlxlv\" (UniqueName: \"kubernetes.io/projected/d6e9031d-0652-4bef-9e25-bafa28c521d2-kube-api-access-dlxlv\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.236990 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e9031d-0652-4bef-9e25-bafa28c521d2-serving-cert\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.237024 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-client-ca\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.237886 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-client-ca\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " 
pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.238472 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-config\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.238783 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-proxy-ca-bundles\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.240732 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e9031d-0652-4bef-9e25-bafa28c521d2-serving-cert\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.251754 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlxlv\" (UniqueName: \"kubernetes.io/projected/d6e9031d-0652-4bef-9e25-bafa28c521d2-kube-api-access-dlxlv\") pod \"controller-manager-7cbf8bb65-fl828\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.376251 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.386943 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.778257 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x"] Mar 20 16:06:43 crc kubenswrapper[4708]: I0320 16:06:43.835221 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cbf8bb65-fl828"] Mar 20 16:06:43 crc kubenswrapper[4708]: W0320 16:06:43.838841 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6e9031d_0652_4bef_9e25_bafa28c521d2.slice/crio-c92efb8804a3f108c03dbd8a88597b09ac0792e70db4bf177f19bea6c0b41b72 WatchSource:0}: Error finding container c92efb8804a3f108c03dbd8a88597b09ac0792e70db4bf177f19bea6c0b41b72: Status 404 returned error can't find the container with id c92efb8804a3f108c03dbd8a88597b09ac0792e70db4bf177f19bea6c0b41b72 Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.117962 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8485bd73-9fba-4cbd-8438-c43bd02d4dfa" path="/var/lib/kubelet/pods/8485bd73-9fba-4cbd-8438-c43bd02d4dfa/volumes" Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.118718 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eec8cee6-e26a-4b5c-b171-121a3f8b0547" path="/var/lib/kubelet/pods/eec8cee6-e26a-4b5c-b171-121a3f8b0547/volumes" Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.219986 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" 
event={"ID":"9f0a8d59-e0e0-448e-9563-13bc99b1bf36","Type":"ContainerStarted","Data":"f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e"} Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.220035 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" event={"ID":"9f0a8d59-e0e0-448e-9563-13bc99b1bf36","Type":"ContainerStarted","Data":"87e5a00da03011d35e27de2f8bf7fede2d53e42d1cd548b25dfb079e8208a495"} Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.221025 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.222604 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" event={"ID":"d6e9031d-0652-4bef-9e25-bafa28c521d2","Type":"ContainerStarted","Data":"bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9"} Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.222633 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" event={"ID":"d6e9031d-0652-4bef-9e25-bafa28c521d2","Type":"ContainerStarted","Data":"c92efb8804a3f108c03dbd8a88597b09ac0792e70db4bf177f19bea6c0b41b72"} Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.223262 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.237727 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.246162 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" podStartSLOduration=3.246141117 podStartE2EDuration="3.246141117s" podCreationTimestamp="2026-03-20 16:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:06:44.242525005 +0000 UTC m=+358.916861720" watchObservedRunningTime="2026-03-20 16:06:44.246141117 +0000 UTC m=+358.920477832" Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.429128 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:06:44 crc kubenswrapper[4708]: I0320 16:06:44.446559 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" podStartSLOduration=3.44653917 podStartE2EDuration="3.44653917s" podCreationTimestamp="2026-03-20 16:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:06:44.264427494 +0000 UTC m=+358.938764209" watchObservedRunningTime="2026-03-20 16:06:44.44653917 +0000 UTC m=+359.120875885" Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.796727 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.797312 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.911411 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.911476 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.911532 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.911587 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.911621 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.911878 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: 
"var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.911915 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.912252 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.912297 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:06:46 crc kubenswrapper[4708]: I0320 16:06:46.926091 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.012943 4708 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.012984 4708 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.012992 4708 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.013001 4708 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.013009 4708 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.240799 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.241202 4708 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898" exitCode=137 Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.241269 4708 scope.go:117] "RemoveContainer" 
containerID="80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.241289 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.265429 4708 scope.go:117] "RemoveContainer" containerID="80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898" Mar 20 16:06:47 crc kubenswrapper[4708]: E0320 16:06:47.267059 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898\": container with ID starting with 80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898 not found: ID does not exist" containerID="80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.267094 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898"} err="failed to get container status \"80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898\": rpc error: code = NotFound desc = could not find container \"80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898\": container with ID starting with 80073094dbd2447d70ee5eefbd5dc0dea9d3f5da0a31af09f495c9557bf81898 not found: ID does not exist" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.572333 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h4kcq"] Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.572827 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h4kcq" podUID="ca697a79-760d-4ae1-827f-bc2b0aee1785" containerName="registry-server" 
containerID="cri-o://e8dbbe84fc93f1ae74677ceebfb74b3d3cc8538080306d1efb7bdb4bae852489" gracePeriod=30 Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.576227 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vzncs"] Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.576617 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vzncs" podUID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" containerName="registry-server" containerID="cri-o://559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac" gracePeriod=30 Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.597328 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6h42j"] Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.597871 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" podUID="4437a79c-97de-40ff-a2fa-d29cc2f86828" containerName="marketplace-operator" containerID="cri-o://aaab20e4741bdb770e63c33a3ab8ab4cb220a2cba1042f4bb09df1578443eab7" gracePeriod=30 Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.620138 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gczfl"] Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.621372 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.641570 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hvrxr"] Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.641919 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hvrxr" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerName="registry-server" containerID="cri-o://f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a" gracePeriod=30 Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.653574 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gczfl"] Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.661106 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dk7n9"] Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.661631 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dk7n9" podUID="65dd12f7-e1af-4213-b264-d846c01eaba8" containerName="registry-server" containerID="cri-o://990ce66c456a25f1e9c43791a2bc32a15760f51b4c2b988e1086fed7550e3004" gracePeriod=30 Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.722610 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvb6p\" (UniqueName: \"kubernetes.io/projected/8dbd2786-5db6-431f-8d8a-ca115e65df27-kube-api-access-cvb6p\") pod \"marketplace-operator-79b997595-gczfl\" (UID: \"8dbd2786-5db6-431f-8d8a-ca115e65df27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.722662 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8dbd2786-5db6-431f-8d8a-ca115e65df27-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gczfl\" (UID: \"8dbd2786-5db6-431f-8d8a-ca115e65df27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.722707 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dbd2786-5db6-431f-8d8a-ca115e65df27-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gczfl\" (UID: \"8dbd2786-5db6-431f-8d8a-ca115e65df27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.824264 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvb6p\" (UniqueName: \"kubernetes.io/projected/8dbd2786-5db6-431f-8d8a-ca115e65df27-kube-api-access-cvb6p\") pod \"marketplace-operator-79b997595-gczfl\" (UID: \"8dbd2786-5db6-431f-8d8a-ca115e65df27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.824332 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8dbd2786-5db6-431f-8d8a-ca115e65df27-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gczfl\" (UID: \"8dbd2786-5db6-431f-8d8a-ca115e65df27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.824353 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dbd2786-5db6-431f-8d8a-ca115e65df27-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gczfl\" (UID: \"8dbd2786-5db6-431f-8d8a-ca115e65df27\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.828365 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dbd2786-5db6-431f-8d8a-ca115e65df27-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gczfl\" (UID: \"8dbd2786-5db6-431f-8d8a-ca115e65df27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.830496 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8dbd2786-5db6-431f-8d8a-ca115e65df27-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gczfl\" (UID: \"8dbd2786-5db6-431f-8d8a-ca115e65df27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.843608 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvb6p\" (UniqueName: \"kubernetes.io/projected/8dbd2786-5db6-431f-8d8a-ca115e65df27-kube-api-access-cvb6p\") pod \"marketplace-operator-79b997595-gczfl\" (UID: \"8dbd2786-5db6-431f-8d8a-ca115e65df27\") " pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:47 crc kubenswrapper[4708]: I0320 16:06:47.951413 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:47 crc kubenswrapper[4708]: E0320 16:06:47.993110 4708 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a is running failed: container process not found" containerID="f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 16:06:47 crc kubenswrapper[4708]: E0320 16:06:47.994176 4708 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a is running failed: container process not found" containerID="f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 16:06:47 crc kubenswrapper[4708]: E0320 16:06:47.994436 4708 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a is running failed: container process not found" containerID="f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 16:06:47 crc kubenswrapper[4708]: E0320 16:06:47.994477 4708 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-hvrxr" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerName="registry-server" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.084439 4708 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.128990 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2vz6\" (UniqueName: \"kubernetes.io/projected/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-kube-api-access-n2vz6\") pod \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.129091 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-utilities\") pod \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.129167 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-catalog-content\") pod \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\" (UID: \"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.129609 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.134095 4708 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.134237 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-kube-api-access-n2vz6" (OuterVolumeSpecName: "kube-api-access-n2vz6") pod "0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" (UID: "0d8b3c28-42d0-479b-b45d-0fe00b8cb36d"). 
InnerVolumeSpecName "kube-api-access-n2vz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.134586 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-utilities" (OuterVolumeSpecName: "utilities") pod "0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" (UID: "0d8b3c28-42d0-479b-b45d-0fe00b8cb36d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.169283 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.169321 4708 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1d9d5a01-5dc6-40ac-912d-31a0b434f428" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.195894 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.195951 4708 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1d9d5a01-5dc6-40ac-912d-31a0b434f428" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.207592 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" (UID: "0d8b3c28-42d0-479b-b45d-0fe00b8cb36d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.231892 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2vz6\" (UniqueName: \"kubernetes.io/projected/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-kube-api-access-n2vz6\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.231937 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.231951 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.262178 4708 generic.go:334] "Generic (PLEG): container finished" podID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerID="f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a" exitCode=0 Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.262295 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvrxr" event={"ID":"bece7d1b-b5d8-4762-b0b8-b2752c422776","Type":"ContainerDied","Data":"f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a"} Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.263959 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.271617 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.273911 4708 generic.go:334] "Generic (PLEG): container finished" podID="ca697a79-760d-4ae1-827f-bc2b0aee1785" containerID="e8dbbe84fc93f1ae74677ceebfb74b3d3cc8538080306d1efb7bdb4bae852489" exitCode=0 Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.274165 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4kcq" event={"ID":"ca697a79-760d-4ae1-827f-bc2b0aee1785","Type":"ContainerDied","Data":"e8dbbe84fc93f1ae74677ceebfb74b3d3cc8538080306d1efb7bdb4bae852489"} Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.294224 4708 generic.go:334] "Generic (PLEG): container finished" podID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" containerID="559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac" exitCode=0 Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.294317 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzncs" event={"ID":"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d","Type":"ContainerDied","Data":"559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac"} Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.294361 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzncs" event={"ID":"0d8b3c28-42d0-479b-b45d-0fe00b8cb36d","Type":"ContainerDied","Data":"4dde2d6ed803822ae4482f52062a9d09c8974be71c65867d65b589b43ea7e430"} Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.294381 4708 scope.go:117] "RemoveContainer" containerID="559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.296325 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vzncs" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.313655 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.313862 4708 generic.go:334] "Generic (PLEG): container finished" podID="65dd12f7-e1af-4213-b264-d846c01eaba8" containerID="990ce66c456a25f1e9c43791a2bc32a15760f51b4c2b988e1086fed7550e3004" exitCode=0 Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.314026 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dk7n9" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.314577 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk7n9" event={"ID":"65dd12f7-e1af-4213-b264-d846c01eaba8","Type":"ContainerDied","Data":"990ce66c456a25f1e9c43791a2bc32a15760f51b4c2b988e1086fed7550e3004"} Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.318430 4708 generic.go:334] "Generic (PLEG): container finished" podID="4437a79c-97de-40ff-a2fa-d29cc2f86828" containerID="aaab20e4741bdb770e63c33a3ab8ab4cb220a2cba1042f4bb09df1578443eab7" exitCode=0 Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.318489 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" event={"ID":"4437a79c-97de-40ff-a2fa-d29cc2f86828","Type":"ContainerDied","Data":"aaab20e4741bdb770e63c33a3ab8ab4cb220a2cba1042f4bb09df1578443eab7"} Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.323733 4708 scope.go:117] "RemoveContainer" containerID="da500285cf7e4fe6bc601347a95f51f1d90603f638db9f179c7d9f1be108db93" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.327002 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.337528 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-utilities\") pod \"bece7d1b-b5d8-4762-b0b8-b2752c422776\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.337567 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-utilities\") pod \"65dd12f7-e1af-4213-b264-d846c01eaba8\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.337612 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-catalog-content\") pod \"65dd12f7-e1af-4213-b264-d846c01eaba8\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.337647 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwtbf\" (UniqueName: \"kubernetes.io/projected/65dd12f7-e1af-4213-b264-d846c01eaba8-kube-api-access-bwtbf\") pod \"65dd12f7-e1af-4213-b264-d846c01eaba8\" (UID: \"65dd12f7-e1af-4213-b264-d846c01eaba8\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.337715 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-utilities\") pod \"ca697a79-760d-4ae1-827f-bc2b0aee1785\" (UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.338099 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-m9mnp\" (UniqueName: \"kubernetes.io/projected/bece7d1b-b5d8-4762-b0b8-b2752c422776-kube-api-access-m9mnp\") pod \"bece7d1b-b5d8-4762-b0b8-b2752c422776\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.338642 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-utilities" (OuterVolumeSpecName: "utilities") pod "ca697a79-760d-4ae1-827f-bc2b0aee1785" (UID: "ca697a79-760d-4ae1-827f-bc2b0aee1785"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.339166 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-utilities" (OuterVolumeSpecName: "utilities") pod "bece7d1b-b5d8-4762-b0b8-b2752c422776" (UID: "bece7d1b-b5d8-4762-b0b8-b2752c422776"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.339995 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh56j\" (UniqueName: \"kubernetes.io/projected/ca697a79-760d-4ae1-827f-bc2b0aee1785-kube-api-access-wh56j\") pod \"ca697a79-760d-4ae1-827f-bc2b0aee1785\" (UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.340621 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-catalog-content\") pod \"ca697a79-760d-4ae1-827f-bc2b0aee1785\" (UID: \"ca697a79-760d-4ae1-827f-bc2b0aee1785\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.340659 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-catalog-content\") pod \"bece7d1b-b5d8-4762-b0b8-b2752c422776\" (UID: \"bece7d1b-b5d8-4762-b0b8-b2752c422776\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.342541 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.342560 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.361786 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-utilities" (OuterVolumeSpecName: "utilities") pod "65dd12f7-e1af-4213-b264-d846c01eaba8" (UID: "65dd12f7-e1af-4213-b264-d846c01eaba8"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.363887 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65dd12f7-e1af-4213-b264-d846c01eaba8-kube-api-access-bwtbf" (OuterVolumeSpecName: "kube-api-access-bwtbf") pod "65dd12f7-e1af-4213-b264-d846c01eaba8" (UID: "65dd12f7-e1af-4213-b264-d846c01eaba8"). InnerVolumeSpecName "kube-api-access-bwtbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.364741 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca697a79-760d-4ae1-827f-bc2b0aee1785-kube-api-access-wh56j" (OuterVolumeSpecName: "kube-api-access-wh56j") pod "ca697a79-760d-4ae1-827f-bc2b0aee1785" (UID: "ca697a79-760d-4ae1-827f-bc2b0aee1785"). InnerVolumeSpecName "kube-api-access-wh56j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.371333 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bece7d1b-b5d8-4762-b0b8-b2752c422776-kube-api-access-m9mnp" (OuterVolumeSpecName: "kube-api-access-m9mnp") pod "bece7d1b-b5d8-4762-b0b8-b2752c422776" (UID: "bece7d1b-b5d8-4762-b0b8-b2752c422776"). InnerVolumeSpecName "kube-api-access-m9mnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.374144 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bece7d1b-b5d8-4762-b0b8-b2752c422776" (UID: "bece7d1b-b5d8-4762-b0b8-b2752c422776"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.374601 4708 scope.go:117] "RemoveContainer" containerID="626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.377306 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vzncs"] Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.385039 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vzncs"] Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.395818 4708 scope.go:117] "RemoveContainer" containerID="559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac" Mar 20 16:06:48 crc kubenswrapper[4708]: E0320 16:06:48.396464 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac\": container with ID starting with 559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac not found: ID does not exist" containerID="559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.396609 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac"} err="failed to get container status \"559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac\": rpc error: code = NotFound desc = could not find container \"559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac\": container with ID starting with 559d3571ed1dc4ed08b8dcdbb0a9391dec3b376f6b85d60120b4088dd8241bac not found: ID does not exist" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.396748 4708 scope.go:117] "RemoveContainer" containerID="da500285cf7e4fe6bc601347a95f51f1d90603f638db9f179c7d9f1be108db93" 
Mar 20 16:06:48 crc kubenswrapper[4708]: E0320 16:06:48.397531 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da500285cf7e4fe6bc601347a95f51f1d90603f638db9f179c7d9f1be108db93\": container with ID starting with da500285cf7e4fe6bc601347a95f51f1d90603f638db9f179c7d9f1be108db93 not found: ID does not exist" containerID="da500285cf7e4fe6bc601347a95f51f1d90603f638db9f179c7d9f1be108db93" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.397577 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da500285cf7e4fe6bc601347a95f51f1d90603f638db9f179c7d9f1be108db93"} err="failed to get container status \"da500285cf7e4fe6bc601347a95f51f1d90603f638db9f179c7d9f1be108db93\": rpc error: code = NotFound desc = could not find container \"da500285cf7e4fe6bc601347a95f51f1d90603f638db9f179c7d9f1be108db93\": container with ID starting with da500285cf7e4fe6bc601347a95f51f1d90603f638db9f179c7d9f1be108db93 not found: ID does not exist" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.397606 4708 scope.go:117] "RemoveContainer" containerID="626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e" Mar 20 16:06:48 crc kubenswrapper[4708]: E0320 16:06:48.397891 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e\": container with ID starting with 626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e not found: ID does not exist" containerID="626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.397910 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e"} err="failed to get container status 
\"626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e\": rpc error: code = NotFound desc = could not find container \"626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e\": container with ID starting with 626cdf56fe1ed0f33f343354174d8af818cbe2b8508d6469de0ad05fa22ee82e not found: ID does not exist" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.397934 4708 scope.go:117] "RemoveContainer" containerID="990ce66c456a25f1e9c43791a2bc32a15760f51b4c2b988e1086fed7550e3004" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.413288 4708 scope.go:117] "RemoveContainer" containerID="e3c1ff683b9b4f531f0a21098f0c453ea353c1e9a70077dd8c40d3f4990f374d" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.420317 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca697a79-760d-4ae1-827f-bc2b0aee1785" (UID: "ca697a79-760d-4ae1-827f-bc2b0aee1785"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.438742 4708 scope.go:117] "RemoveContainer" containerID="117d1ee6ba475b4ba00e4179f031bc9d85e357e32c3f526dd7b4f838a090027a" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.443460 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-operator-metrics\") pod \"4437a79c-97de-40ff-a2fa-d29cc2f86828\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.443743 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-trusted-ca\") pod \"4437a79c-97de-40ff-a2fa-d29cc2f86828\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.443873 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8swt\" (UniqueName: \"kubernetes.io/projected/4437a79c-97de-40ff-a2fa-d29cc2f86828-kube-api-access-w8swt\") pod \"4437a79c-97de-40ff-a2fa-d29cc2f86828\" (UID: \"4437a79c-97de-40ff-a2fa-d29cc2f86828\") " Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.444279 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.444445 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwtbf\" (UniqueName: \"kubernetes.io/projected/65dd12f7-e1af-4213-b264-d846c01eaba8-kube-api-access-bwtbf\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.444596 4708 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-m9mnp\" (UniqueName: \"kubernetes.io/projected/bece7d1b-b5d8-4762-b0b8-b2752c422776-kube-api-access-m9mnp\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.444780 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh56j\" (UniqueName: \"kubernetes.io/projected/ca697a79-760d-4ae1-827f-bc2b0aee1785-kube-api-access-wh56j\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.444892 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca697a79-760d-4ae1-827f-bc2b0aee1785-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.444976 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bece7d1b-b5d8-4762-b0b8-b2752c422776-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.445319 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4437a79c-97de-40ff-a2fa-d29cc2f86828" (UID: "4437a79c-97de-40ff-a2fa-d29cc2f86828"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.448432 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4437a79c-97de-40ff-a2fa-d29cc2f86828-kube-api-access-w8swt" (OuterVolumeSpecName: "kube-api-access-w8swt") pod "4437a79c-97de-40ff-a2fa-d29cc2f86828" (UID: "4437a79c-97de-40ff-a2fa-d29cc2f86828"). InnerVolumeSpecName "kube-api-access-w8swt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.448495 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4437a79c-97de-40ff-a2fa-d29cc2f86828" (UID: "4437a79c-97de-40ff-a2fa-d29cc2f86828"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.528962 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65dd12f7-e1af-4213-b264-d846c01eaba8" (UID: "65dd12f7-e1af-4213-b264-d846c01eaba8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.546317 4708 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.546362 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8swt\" (UniqueName: \"kubernetes.io/projected/4437a79c-97de-40ff-a2fa-d29cc2f86828-kube-api-access-w8swt\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.546374 4708 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4437a79c-97de-40ff-a2fa-d29cc2f86828-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.546386 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/65dd12f7-e1af-4213-b264-d846c01eaba8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.549277 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gczfl"] Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.645112 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dk7n9"] Mar 20 16:06:48 crc kubenswrapper[4708]: I0320 16:06:48.648823 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dk7n9"] Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.327412 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" event={"ID":"8dbd2786-5db6-431f-8d8a-ca115e65df27","Type":"ContainerStarted","Data":"01705c6bafadd83f3b6386a339c2e11235d3430063b5f736db757a87507123f8"} Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.327503 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" event={"ID":"8dbd2786-5db6-431f-8d8a-ca115e65df27","Type":"ContainerStarted","Data":"f0378e87a54dbb5b10f5d03509f0768348a782d86d5784278079f3022ab0cbd3"} Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.327545 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.331893 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.331908 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6h42j" event={"ID":"4437a79c-97de-40ff-a2fa-d29cc2f86828","Type":"ContainerDied","Data":"9baf985b21d59446d3a7141ba543e84f72f499f354436709458db0e324d3e245"} Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.332042 4708 scope.go:117] "RemoveContainer" containerID="aaab20e4741bdb770e63c33a3ab8ab4cb220a2cba1042f4bb09df1578443eab7" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.334392 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.335869 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hvrxr" event={"ID":"bece7d1b-b5d8-4762-b0b8-b2752c422776","Type":"ContainerDied","Data":"04c15eb141f571ab55102f7edba5231c4e5bc5373fc6b2007d86d88066098acb"} Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.335995 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hvrxr" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.344527 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4kcq" event={"ID":"ca697a79-760d-4ae1-827f-bc2b0aee1785","Type":"ContainerDied","Data":"1787afc0948237b24897fb97632cc654ac801b39ec4592412ad028488155510f"} Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.344790 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h4kcq" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.357374 4708 scope.go:117] "RemoveContainer" containerID="f2a34addac5a89b21226f95fb7b2d9084d247826891a506d82e1cd76b96ec49a" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.360088 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gczfl" podStartSLOduration=2.360066415 podStartE2EDuration="2.360066415s" podCreationTimestamp="2026-03-20 16:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:06:49.356638698 +0000 UTC m=+364.030975433" watchObservedRunningTime="2026-03-20 16:06:49.360066415 +0000 UTC m=+364.034403130" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.386164 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6h42j"] Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.389633 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6h42j"] Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.395675 4708 scope.go:117] "RemoveContainer" containerID="8d8460ba1a16625147ea48db10e794e885cd0575b6f67ab586295f45bdad869a" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.431640 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h4kcq"] Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.438707 4708 scope.go:117] "RemoveContainer" containerID="04d6f12446a266105a35664aef9b5a145b0216aa077bdd34a3b73490ec0e4c6c" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.446742 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h4kcq"] Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.451650 4708 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hvrxr"] Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.456500 4708 scope.go:117] "RemoveContainer" containerID="e8dbbe84fc93f1ae74677ceebfb74b3d3cc8538080306d1efb7bdb4bae852489" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.457266 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hvrxr"] Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.476889 4708 scope.go:117] "RemoveContainer" containerID="ae8e68f00501ea8d8f0dbd6bdea9c80b0f6885a954ebd4415f744adb99d9b9cb" Mar 20 16:06:49 crc kubenswrapper[4708]: I0320 16:06:49.493080 4708 scope.go:117] "RemoveContainer" containerID="60a31673c741948f3ed4bfb1fc52b9f3930b2cf6b42784e5a9ea24652e9f351b" Mar 20 16:06:50 crc kubenswrapper[4708]: I0320 16:06:50.119201 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" path="/var/lib/kubelet/pods/0d8b3c28-42d0-479b-b45d-0fe00b8cb36d/volumes" Mar 20 16:06:50 crc kubenswrapper[4708]: I0320 16:06:50.120155 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4437a79c-97de-40ff-a2fa-d29cc2f86828" path="/var/lib/kubelet/pods/4437a79c-97de-40ff-a2fa-d29cc2f86828/volumes" Mar 20 16:06:50 crc kubenswrapper[4708]: I0320 16:06:50.120621 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65dd12f7-e1af-4213-b264-d846c01eaba8" path="/var/lib/kubelet/pods/65dd12f7-e1af-4213-b264-d846c01eaba8/volumes" Mar 20 16:06:50 crc kubenswrapper[4708]: I0320 16:06:50.121736 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" path="/var/lib/kubelet/pods/bece7d1b-b5d8-4762-b0b8-b2752c422776/volumes" Mar 20 16:06:50 crc kubenswrapper[4708]: I0320 16:06:50.122299 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca697a79-760d-4ae1-827f-bc2b0aee1785" 
path="/var/lib/kubelet/pods/ca697a79-760d-4ae1-827f-bc2b0aee1785/volumes" Mar 20 16:06:56 crc kubenswrapper[4708]: I0320 16:06:56.567385 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 16:07:01 crc kubenswrapper[4708]: I0320 16:07:01.514761 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cbf8bb65-fl828"] Mar 20 16:07:01 crc kubenswrapper[4708]: I0320 16:07:01.516402 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" podUID="d6e9031d-0652-4bef-9e25-bafa28c521d2" containerName="controller-manager" containerID="cri-o://bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9" gracePeriod=30 Mar 20 16:07:01 crc kubenswrapper[4708]: I0320 16:07:01.537778 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x"] Mar 20 16:07:01 crc kubenswrapper[4708]: I0320 16:07:01.538055 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" podUID="9f0a8d59-e0e0-448e-9563-13bc99b1bf36" containerName="route-controller-manager" containerID="cri-o://f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e" gracePeriod=30 Mar 20 16:07:01 crc kubenswrapper[4708]: I0320 16:07:01.998914 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.038274 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-proxy-ca-bundles\") pod \"d6e9031d-0652-4bef-9e25-bafa28c521d2\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.038338 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e9031d-0652-4bef-9e25-bafa28c521d2-serving-cert\") pod \"d6e9031d-0652-4bef-9e25-bafa28c521d2\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.038385 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-client-ca\") pod \"d6e9031d-0652-4bef-9e25-bafa28c521d2\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.038454 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-config\") pod \"d6e9031d-0652-4bef-9e25-bafa28c521d2\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.038493 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlxlv\" (UniqueName: \"kubernetes.io/projected/d6e9031d-0652-4bef-9e25-bafa28c521d2-kube-api-access-dlxlv\") pod \"d6e9031d-0652-4bef-9e25-bafa28c521d2\" (UID: \"d6e9031d-0652-4bef-9e25-bafa28c521d2\") " Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.039903 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-client-ca" (OuterVolumeSpecName: "client-ca") pod "d6e9031d-0652-4bef-9e25-bafa28c521d2" (UID: "d6e9031d-0652-4bef-9e25-bafa28c521d2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.039996 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-config" (OuterVolumeSpecName: "config") pod "d6e9031d-0652-4bef-9e25-bafa28c521d2" (UID: "d6e9031d-0652-4bef-9e25-bafa28c521d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.040050 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d6e9031d-0652-4bef-9e25-bafa28c521d2" (UID: "d6e9031d-0652-4bef-9e25-bafa28c521d2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.045564 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e9031d-0652-4bef-9e25-bafa28c521d2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d6e9031d-0652-4bef-9e25-bafa28c521d2" (UID: "d6e9031d-0652-4bef-9e25-bafa28c521d2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.046844 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e9031d-0652-4bef-9e25-bafa28c521d2-kube-api-access-dlxlv" (OuterVolumeSpecName: "kube-api-access-dlxlv") pod "d6e9031d-0652-4bef-9e25-bafa28c521d2" (UID: "d6e9031d-0652-4bef-9e25-bafa28c521d2"). InnerVolumeSpecName "kube-api-access-dlxlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.087488 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.140234 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q87b\" (UniqueName: \"kubernetes.io/projected/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-kube-api-access-9q87b\") pod \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.140348 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-serving-cert\") pod \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.140414 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-config\") pod \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.140443 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-client-ca\") pod \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\" (UID: \"9f0a8d59-e0e0-448e-9563-13bc99b1bf36\") " Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.140717 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 
16:07:02.140745 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e9031d-0652-4bef-9e25-bafa28c521d2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.140757 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.140769 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e9031d-0652-4bef-9e25-bafa28c521d2-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.140782 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlxlv\" (UniqueName: \"kubernetes.io/projected/d6e9031d-0652-4bef-9e25-bafa28c521d2-kube-api-access-dlxlv\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.141954 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-client-ca" (OuterVolumeSpecName: "client-ca") pod "9f0a8d59-e0e0-448e-9563-13bc99b1bf36" (UID: "9f0a8d59-e0e0-448e-9563-13bc99b1bf36"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.145264 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-config" (OuterVolumeSpecName: "config") pod "9f0a8d59-e0e0-448e-9563-13bc99b1bf36" (UID: "9f0a8d59-e0e0-448e-9563-13bc99b1bf36"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.145781 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-kube-api-access-9q87b" (OuterVolumeSpecName: "kube-api-access-9q87b") pod "9f0a8d59-e0e0-448e-9563-13bc99b1bf36" (UID: "9f0a8d59-e0e0-448e-9563-13bc99b1bf36"). InnerVolumeSpecName "kube-api-access-9q87b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.147634 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9f0a8d59-e0e0-448e-9563-13bc99b1bf36" (UID: "9f0a8d59-e0e0-448e-9563-13bc99b1bf36"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.243055 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.243615 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.243869 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q87b\" (UniqueName: \"kubernetes.io/projected/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-kube-api-access-9q87b\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.244062 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0a8d59-e0e0-448e-9563-13bc99b1bf36-serving-cert\") on node \"crc\" DevicePath 
\"\"" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.447475 4708 generic.go:334] "Generic (PLEG): container finished" podID="9f0a8d59-e0e0-448e-9563-13bc99b1bf36" containerID="f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e" exitCode=0 Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.447558 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" event={"ID":"9f0a8d59-e0e0-448e-9563-13bc99b1bf36","Type":"ContainerDied","Data":"f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e"} Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.447592 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" event={"ID":"9f0a8d59-e0e0-448e-9563-13bc99b1bf36","Type":"ContainerDied","Data":"87e5a00da03011d35e27de2f8bf7fede2d53e42d1cd548b25dfb079e8208a495"} Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.447611 4708 scope.go:117] "RemoveContainer" containerID="f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.447624 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.450308 4708 generic.go:334] "Generic (PLEG): container finished" podID="d6e9031d-0652-4bef-9e25-bafa28c521d2" containerID="bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9" exitCode=0 Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.450364 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" event={"ID":"d6e9031d-0652-4bef-9e25-bafa28c521d2","Type":"ContainerDied","Data":"bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9"} Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.450391 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" event={"ID":"d6e9031d-0652-4bef-9e25-bafa28c521d2","Type":"ContainerDied","Data":"c92efb8804a3f108c03dbd8a88597b09ac0792e70db4bf177f19bea6c0b41b72"} Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.450445 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cbf8bb65-fl828" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.470567 4708 scope.go:117] "RemoveContainer" containerID="f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e" Mar 20 16:07:02 crc kubenswrapper[4708]: E0320 16:07:02.471497 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e\": container with ID starting with f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e not found: ID does not exist" containerID="f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.471874 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e"} err="failed to get container status \"f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e\": rpc error: code = NotFound desc = could not find container \"f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e\": container with ID starting with f6d9cc353ea44fad88ebd1ca8244f217712c93f0bd7f142c5c1de19d0107d53e not found: ID does not exist" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.472159 4708 scope.go:117] "RemoveContainer" containerID="bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.476068 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cbf8bb65-fl828"] Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.482090 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7cbf8bb65-fl828"] Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.494121 4708 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x"] Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.497994 4708 scope.go:117] "RemoveContainer" containerID="bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9" Mar 20 16:07:02 crc kubenswrapper[4708]: E0320 16:07:02.498876 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9\": container with ID starting with bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9 not found: ID does not exist" containerID="bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.498962 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9"} err="failed to get container status \"bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9\": rpc error: code = NotFound desc = could not find container \"bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9\": container with ID starting with bc3a2d77ae5e13d9d365f59b2fec3e0ce0c006fd31e3aa36942cb975c1ea0ed9 not found: ID does not exist" Mar 20 16:07:02 crc kubenswrapper[4708]: I0320 16:07:02.502706 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-jxd6x"] Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.038571 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59b44b9f79-bg92t"] Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039485 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039532 4708 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039551 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" containerName="extract-utilities" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039563 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" containerName="extract-utilities" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039576 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca697a79-760d-4ae1-827f-bc2b0aee1785" containerName="extract-utilities" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039588 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca697a79-760d-4ae1-827f-bc2b0aee1785" containerName="extract-utilities" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039618 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca697a79-760d-4ae1-827f-bc2b0aee1785" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039631 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca697a79-760d-4ae1-827f-bc2b0aee1785" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039654 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" containerName="extract-content" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039694 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" containerName="extract-content" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039722 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca697a79-760d-4ae1-827f-bc2b0aee1785" containerName="extract-content" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039737 4708 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ca697a79-760d-4ae1-827f-bc2b0aee1785" containerName="extract-content" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039771 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039783 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039806 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4437a79c-97de-40ff-a2fa-d29cc2f86828" containerName="marketplace-operator" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039818 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="4437a79c-97de-40ff-a2fa-d29cc2f86828" containerName="marketplace-operator" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039841 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e9031d-0652-4bef-9e25-bafa28c521d2" containerName="controller-manager" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039852 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e9031d-0652-4bef-9e25-bafa28c521d2" containerName="controller-manager" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039879 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65dd12f7-e1af-4213-b264-d846c01eaba8" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039889 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="65dd12f7-e1af-4213-b264-d846c01eaba8" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039902 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0a8d59-e0e0-448e-9563-13bc99b1bf36" containerName="route-controller-manager" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 
16:07:03.039914 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0a8d59-e0e0-448e-9563-13bc99b1bf36" containerName="route-controller-manager" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039936 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerName="extract-utilities" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039949 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerName="extract-utilities" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.039962 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65dd12f7-e1af-4213-b264-d846c01eaba8" containerName="extract-utilities" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.039974 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="65dd12f7-e1af-4213-b264-d846c01eaba8" containerName="extract-utilities" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.040000 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerName="extract-content" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.040013 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerName="extract-content" Mar 20 16:07:03 crc kubenswrapper[4708]: E0320 16:07:03.040034 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65dd12f7-e1af-4213-b264-d846c01eaba8" containerName="extract-content" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.040049 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="65dd12f7-e1af-4213-b264-d846c01eaba8" containerName="extract-content" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.040465 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca697a79-760d-4ae1-827f-bc2b0aee1785" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 
16:07:03.040498 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="4437a79c-97de-40ff-a2fa-d29cc2f86828" containerName="marketplace-operator" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.040525 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="bece7d1b-b5d8-4762-b0b8-b2752c422776" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.040544 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e9031d-0652-4bef-9e25-bafa28c521d2" containerName="controller-manager" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.040565 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0a8d59-e0e0-448e-9563-13bc99b1bf36" containerName="route-controller-manager" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.040581 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="65dd12f7-e1af-4213-b264-d846c01eaba8" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.040606 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d8b3c28-42d0-479b-b45d-0fe00b8cb36d" containerName="registry-server" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.041702 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.044562 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk"] Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.045597 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.062420 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.062820 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.063179 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.063593 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.063934 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.064853 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.065019 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.065137 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.065460 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.065616 4708 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.065786 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.065998 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.067854 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59b44b9f79-bg92t"] Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.069493 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.082320 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk"] Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.158783 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-client-ca\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.158849 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-proxy-ca-bundles\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.158907 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-config\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.158958 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b3343a9-ff80-46c0-874a-53f4866bf8fb-serving-cert\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.158990 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-client-ca\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.159020 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02d4142b-e300-4057-a69e-2c750ce861e6-serving-cert\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.159069 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v68tn\" (UniqueName: \"kubernetes.io/projected/2b3343a9-ff80-46c0-874a-53f4866bf8fb-kube-api-access-v68tn\") pod 
\"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.159116 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgx7h\" (UniqueName: \"kubernetes.io/projected/02d4142b-e300-4057-a69e-2c750ce861e6-kube-api-access-jgx7h\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.159154 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-config\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.260125 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgx7h\" (UniqueName: \"kubernetes.io/projected/02d4142b-e300-4057-a69e-2c750ce861e6-kube-api-access-jgx7h\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.260200 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-config\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.260264 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-client-ca\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.260290 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-proxy-ca-bundles\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.260321 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-config\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.260368 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b3343a9-ff80-46c0-874a-53f4866bf8fb-serving-cert\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.260397 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-client-ca\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " 
pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.260424 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02d4142b-e300-4057-a69e-2c750ce861e6-serving-cert\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.260476 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v68tn\" (UniqueName: \"kubernetes.io/projected/2b3343a9-ff80-46c0-874a-53f4866bf8fb-kube-api-access-v68tn\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.262430 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-client-ca\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.262691 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-client-ca\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.262710 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-proxy-ca-bundles\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.262841 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-config\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.265656 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-config\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.268378 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b3343a9-ff80-46c0-874a-53f4866bf8fb-serving-cert\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.269389 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02d4142b-e300-4057-a69e-2c750ce861e6-serving-cert\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.283014 4708 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jgx7h\" (UniqueName: \"kubernetes.io/projected/02d4142b-e300-4057-a69e-2c750ce861e6-kube-api-access-jgx7h\") pod \"route-controller-manager-5d9f786696-rptvk\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.284809 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v68tn\" (UniqueName: \"kubernetes.io/projected/2b3343a9-ff80-46c0-874a-53f4866bf8fb-kube-api-access-v68tn\") pod \"controller-manager-59b44b9f79-bg92t\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.384062 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:03 crc kubenswrapper[4708]: I0320 16:07:03.388927 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:04 crc kubenswrapper[4708]: I0320 16:07:04.118467 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0a8d59-e0e0-448e-9563-13bc99b1bf36" path="/var/lib/kubelet/pods/9f0a8d59-e0e0-448e-9563-13bc99b1bf36/volumes" Mar 20 16:07:04 crc kubenswrapper[4708]: I0320 16:07:04.119589 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e9031d-0652-4bef-9e25-bafa28c521d2" path="/var/lib/kubelet/pods/d6e9031d-0652-4bef-9e25-bafa28c521d2/volumes" Mar 20 16:07:04 crc kubenswrapper[4708]: I0320 16:07:04.504998 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk"] Mar 20 16:07:04 crc kubenswrapper[4708]: I0320 16:07:04.512809 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59b44b9f79-bg92t"] Mar 20 16:07:05 crc kubenswrapper[4708]: I0320 16:07:05.483351 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" event={"ID":"2b3343a9-ff80-46c0-874a-53f4866bf8fb","Type":"ContainerStarted","Data":"bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb"} Mar 20 16:07:05 crc kubenswrapper[4708]: I0320 16:07:05.483967 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" event={"ID":"2b3343a9-ff80-46c0-874a-53f4866bf8fb","Type":"ContainerStarted","Data":"0ee8105c7f8d5a4ab69a847f4d09a32641b11bfc6c1caf0324c5ae99b7818d94"} Mar 20 16:07:05 crc kubenswrapper[4708]: I0320 16:07:05.483998 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:05 crc kubenswrapper[4708]: I0320 16:07:05.485117 4708 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" event={"ID":"02d4142b-e300-4057-a69e-2c750ce861e6","Type":"ContainerStarted","Data":"3d6af093a99ef88bbde28175dfd7195533c85abc323c4fd6b84a33a2dcab32bb"} Mar 20 16:07:05 crc kubenswrapper[4708]: I0320 16:07:05.485168 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" event={"ID":"02d4142b-e300-4057-a69e-2c750ce861e6","Type":"ContainerStarted","Data":"75e3f5a4e460e18040479086d3bdddafde1ce3b81802142c2863348a1b6039b4"} Mar 20 16:07:05 crc kubenswrapper[4708]: I0320 16:07:05.485397 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:05 crc kubenswrapper[4708]: I0320 16:07:05.489573 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:05 crc kubenswrapper[4708]: I0320 16:07:05.492089 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:05 crc kubenswrapper[4708]: I0320 16:07:05.528763 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" podStartSLOduration=4.528742654 podStartE2EDuration="4.528742654s" podCreationTimestamp="2026-03-20 16:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:07:05.525606056 +0000 UTC m=+380.199942781" watchObservedRunningTime="2026-03-20 16:07:05.528742654 +0000 UTC m=+380.203079369" Mar 20 16:07:05 crc kubenswrapper[4708]: I0320 16:07:05.529200 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" podStartSLOduration=4.529196327 podStartE2EDuration="4.529196327s" podCreationTimestamp="2026-03-20 16:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:07:05.507498614 +0000 UTC m=+380.181835329" watchObservedRunningTime="2026-03-20 16:07:05.529196327 +0000 UTC m=+380.203533042" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.068664 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9jd6r"] Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.070057 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.095248 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9jd6r"] Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.213470 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b413af87-dd3b-4608-a506-46859daf6103-registry-tls\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.213554 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgwbd\" (UniqueName: \"kubernetes.io/projected/b413af87-dd3b-4608-a506-46859daf6103-kube-api-access-kgwbd\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.213582 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b413af87-dd3b-4608-a506-46859daf6103-bound-sa-token\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.214005 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.214094 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b413af87-dd3b-4608-a506-46859daf6103-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.214135 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b413af87-dd3b-4608-a506-46859daf6103-trusted-ca\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.214183 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b413af87-dd3b-4608-a506-46859daf6103-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: 
\"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.214213 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b413af87-dd3b-4608-a506-46859daf6103-registry-certificates\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.251049 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.316016 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b413af87-dd3b-4608-a506-46859daf6103-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.316082 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b413af87-dd3b-4608-a506-46859daf6103-trusted-ca\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.316122 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/b413af87-dd3b-4608-a506-46859daf6103-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.316146 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b413af87-dd3b-4608-a506-46859daf6103-registry-certificates\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.316175 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b413af87-dd3b-4608-a506-46859daf6103-registry-tls\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.316200 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgwbd\" (UniqueName: \"kubernetes.io/projected/b413af87-dd3b-4608-a506-46859daf6103-kube-api-access-kgwbd\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.316219 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b413af87-dd3b-4608-a506-46859daf6103-bound-sa-token\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.316700 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b413af87-dd3b-4608-a506-46859daf6103-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.317832 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b413af87-dd3b-4608-a506-46859daf6103-trusted-ca\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.318070 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b413af87-dd3b-4608-a506-46859daf6103-registry-certificates\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.324915 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b413af87-dd3b-4608-a506-46859daf6103-registry-tls\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.324972 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b413af87-dd3b-4608-a506-46859daf6103-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc 
kubenswrapper[4708]: I0320 16:07:07.338490 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgwbd\" (UniqueName: \"kubernetes.io/projected/b413af87-dd3b-4608-a506-46859daf6103-kube-api-access-kgwbd\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.343960 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b413af87-dd3b-4608-a506-46859daf6103-bound-sa-token\") pod \"image-registry-66df7c8f76-9jd6r\" (UID: \"b413af87-dd3b-4608-a506-46859daf6103\") " pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.390535 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:07 crc kubenswrapper[4708]: I0320 16:07:07.823204 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9jd6r"] Mar 20 16:07:08 crc kubenswrapper[4708]: I0320 16:07:08.489376 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpk69"] Mar 20 16:07:08 crc kubenswrapper[4708]: I0320 16:07:08.519498 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" event={"ID":"b413af87-dd3b-4608-a506-46859daf6103","Type":"ContainerStarted","Data":"004c13b77d8ca70ffa6448a0e5954d2fe6f4b35c6c1ece2586d60bab548631ac"} Mar 20 16:07:08 crc kubenswrapper[4708]: I0320 16:07:08.520139 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:08 crc kubenswrapper[4708]: I0320 16:07:08.520160 4708 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" event={"ID":"b413af87-dd3b-4608-a506-46859daf6103","Type":"ContainerStarted","Data":"18c239b1b0dfa569b4884cb0fd629dd932b18ad55d39179d342e458ec7d8da3d"} Mar 20 16:07:08 crc kubenswrapper[4708]: I0320 16:07:08.562494 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" podStartSLOduration=1.562471941 podStartE2EDuration="1.562471941s" podCreationTimestamp="2026-03-20 16:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:07:08.559535498 +0000 UTC m=+383.233872243" watchObservedRunningTime="2026-03-20 16:07:08.562471941 +0000 UTC m=+383.236808656" Mar 20 16:07:21 crc kubenswrapper[4708]: I0320 16:07:21.522821 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59b44b9f79-bg92t"] Mar 20 16:07:21 crc kubenswrapper[4708]: I0320 16:07:21.523931 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" podUID="2b3343a9-ff80-46c0-874a-53f4866bf8fb" containerName="controller-manager" containerID="cri-o://bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb" gracePeriod=30 Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.119709 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.233779 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-proxy-ca-bundles\") pod \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.233871 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-config\") pod \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.233939 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v68tn\" (UniqueName: \"kubernetes.io/projected/2b3343a9-ff80-46c0-874a-53f4866bf8fb-kube-api-access-v68tn\") pod \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.234034 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b3343a9-ff80-46c0-874a-53f4866bf8fb-serving-cert\") pod \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.234073 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-client-ca\") pod \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\" (UID: \"2b3343a9-ff80-46c0-874a-53f4866bf8fb\") " Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.235585 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2b3343a9-ff80-46c0-874a-53f4866bf8fb" (UID: "2b3343a9-ff80-46c0-874a-53f4866bf8fb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.235706 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-config" (OuterVolumeSpecName: "config") pod "2b3343a9-ff80-46c0-874a-53f4866bf8fb" (UID: "2b3343a9-ff80-46c0-874a-53f4866bf8fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.236030 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-client-ca" (OuterVolumeSpecName: "client-ca") pod "2b3343a9-ff80-46c0-874a-53f4866bf8fb" (UID: "2b3343a9-ff80-46c0-874a-53f4866bf8fb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.243947 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b3343a9-ff80-46c0-874a-53f4866bf8fb-kube-api-access-v68tn" (OuterVolumeSpecName: "kube-api-access-v68tn") pod "2b3343a9-ff80-46c0-874a-53f4866bf8fb" (UID: "2b3343a9-ff80-46c0-874a-53f4866bf8fb"). InnerVolumeSpecName "kube-api-access-v68tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.244762 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b3343a9-ff80-46c0-874a-53f4866bf8fb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2b3343a9-ff80-46c0-874a-53f4866bf8fb" (UID: "2b3343a9-ff80-46c0-874a-53f4866bf8fb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.335476 4708 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.335519 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.335529 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v68tn\" (UniqueName: \"kubernetes.io/projected/2b3343a9-ff80-46c0-874a-53f4866bf8fb-kube-api-access-v68tn\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.335553 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b3343a9-ff80-46c0-874a-53f4866bf8fb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.335563 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b3343a9-ff80-46c0-874a-53f4866bf8fb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.643972 4708 generic.go:334] "Generic (PLEG): container finished" podID="2b3343a9-ff80-46c0-874a-53f4866bf8fb" containerID="bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb" exitCode=0 Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.644091 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.644070 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" event={"ID":"2b3343a9-ff80-46c0-874a-53f4866bf8fb","Type":"ContainerDied","Data":"bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb"} Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.644343 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59b44b9f79-bg92t" event={"ID":"2b3343a9-ff80-46c0-874a-53f4866bf8fb","Type":"ContainerDied","Data":"0ee8105c7f8d5a4ab69a847f4d09a32641b11bfc6c1caf0324c5ae99b7818d94"} Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.644382 4708 scope.go:117] "RemoveContainer" containerID="bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.679277 4708 scope.go:117] "RemoveContainer" containerID="bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb" Mar 20 16:07:22 crc kubenswrapper[4708]: E0320 16:07:22.680344 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb\": container with ID starting with bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb not found: ID does not exist" containerID="bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.680392 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb"} err="failed to get container status \"bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb\": rpc error: code = NotFound desc = could not find container 
\"bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb\": container with ID starting with bb7b15a8c123c384db3472ee8c3615db15e34421e5bba092080076234a7d12bb not found: ID does not exist" Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.690821 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59b44b9f79-bg92t"] Mar 20 16:07:22 crc kubenswrapper[4708]: I0320 16:07:22.695618 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-59b44b9f79-bg92t"] Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.043906 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx"] Mar 20 16:07:23 crc kubenswrapper[4708]: E0320 16:07:23.044140 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3343a9-ff80-46c0-874a-53f4866bf8fb" containerName="controller-manager" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.044157 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3343a9-ff80-46c0-874a-53f4866bf8fb" containerName="controller-manager" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.044253 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b3343a9-ff80-46c0-874a-53f4866bf8fb" containerName="controller-manager" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.044614 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.047071 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.047188 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.047398 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.047415 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.047550 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.047887 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.054643 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx"] Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.055094 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.145151 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-proxy-ca-bundles\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " 
pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.145244 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-config\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.145271 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-serving-cert\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.145304 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-client-ca\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.145328 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx5kp\" (UniqueName: \"kubernetes.io/projected/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-kube-api-access-kx5kp\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.247092 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-proxy-ca-bundles\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.247154 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-config\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.247183 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-serving-cert\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.247209 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-client-ca\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.247227 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx5kp\" (UniqueName: \"kubernetes.io/projected/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-kube-api-access-kx5kp\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.248604 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-proxy-ca-bundles\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.250631 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-client-ca\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.250648 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-config\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.256822 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-serving-cert\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.263296 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx5kp\" (UniqueName: \"kubernetes.io/projected/4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb-kube-api-access-kx5kp\") pod \"controller-manager-7cbf8bb65-ck7lx\" (UID: \"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb\") " pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc 
kubenswrapper[4708]: I0320 16:07:23.381683 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:23 crc kubenswrapper[4708]: I0320 16:07:23.855554 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx"] Mar 20 16:07:23 crc kubenswrapper[4708]: W0320 16:07:23.862602 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b4bae3c_c7c2_44d4_9db4_c30a42d2c7eb.slice/crio-19c585bb4faf69021dff009c6959648dbc051cd7cda5dee52706447c29659f40 WatchSource:0}: Error finding container 19c585bb4faf69021dff009c6959648dbc051cd7cda5dee52706447c29659f40: Status 404 returned error can't find the container with id 19c585bb4faf69021dff009c6959648dbc051cd7cda5dee52706447c29659f40 Mar 20 16:07:24 crc kubenswrapper[4708]: I0320 16:07:24.124057 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b3343a9-ff80-46c0-874a-53f4866bf8fb" path="/var/lib/kubelet/pods/2b3343a9-ff80-46c0-874a-53f4866bf8fb/volumes" Mar 20 16:07:24 crc kubenswrapper[4708]: I0320 16:07:24.660649 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" event={"ID":"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb","Type":"ContainerStarted","Data":"19c585bb4faf69021dff009c6959648dbc051cd7cda5dee52706447c29659f40"} Mar 20 16:07:27 crc kubenswrapper[4708]: I0320 16:07:27.396417 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-9jd6r" Mar 20 16:07:27 crc kubenswrapper[4708]: I0320 16:07:27.465005 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tkdfr"] Mar 20 16:07:27 crc kubenswrapper[4708]: I0320 16:07:27.692730 4708 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" event={"ID":"4b4bae3c-c7c2-44d4-9db4-c30a42d2c7eb","Type":"ContainerStarted","Data":"03aaa58781f160634b87923fdc3ab028ceb83c9103926280c9c1c50bbbf0e051"} Mar 20 16:07:27 crc kubenswrapper[4708]: I0320 16:07:27.693252 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:27 crc kubenswrapper[4708]: I0320 16:07:27.700882 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" Mar 20 16:07:27 crc kubenswrapper[4708]: I0320 16:07:27.717532 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cbf8bb65-ck7lx" podStartSLOduration=6.717503801 podStartE2EDuration="6.717503801s" podCreationTimestamp="2026-03-20 16:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:07:27.714261291 +0000 UTC m=+402.388598006" watchObservedRunningTime="2026-03-20 16:07:27.717503801 +0000 UTC m=+402.391840516" Mar 20 16:07:33 crc kubenswrapper[4708]: I0320 16:07:33.531475 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" podUID="68a687d9-a448-4a5c-b7b9-e4510468b3c9" containerName="oauth-openshift" containerID="cri-o://abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9" gracePeriod=15 Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.713145 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.740928 4708 generic.go:334] "Generic (PLEG): container finished" podID="68a687d9-a448-4a5c-b7b9-e4510468b3c9" containerID="abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9" exitCode=0 Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.741004 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.741005 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" event={"ID":"68a687d9-a448-4a5c-b7b9-e4510468b3c9","Type":"ContainerDied","Data":"abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9"} Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.741115 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpk69" event={"ID":"68a687d9-a448-4a5c-b7b9-e4510468b3c9","Type":"ContainerDied","Data":"13cd6ad7efc2f4fc6c79fc0cc717c27d0eb3dc0f221b73e60c03555ebeb9c6cf"} Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.741138 4708 scope.go:117] "RemoveContainer" containerID="abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.765287 4708 scope.go:117] "RemoveContainer" containerID="abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9" Mar 20 16:07:34 crc kubenswrapper[4708]: E0320 16:07:34.766017 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9\": container with ID starting with abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9 not found: ID does not exist" 
containerID="abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.766093 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9"} err="failed to get container status \"abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9\": rpc error: code = NotFound desc = could not find container \"abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9\": container with ID starting with abe7241338adcd704a0571a5e723ca93cb61299ca50d1f792e8930b22e26fbf9 not found: ID does not exist" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.768544 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-75df47bf4c-npmwf"] Mar 20 16:07:34 crc kubenswrapper[4708]: E0320 16:07:34.768885 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a687d9-a448-4a5c-b7b9-e4510468b3c9" containerName="oauth-openshift" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.768906 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a687d9-a448-4a5c-b7b9-e4510468b3c9" containerName="oauth-openshift" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.769046 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a687d9-a448-4a5c-b7b9-e4510468b3c9" containerName="oauth-openshift" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.769563 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.780827 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75df47bf4c-npmwf"] Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.856251 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-login\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.856837 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.856302 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-dir\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.858336 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-provider-selection\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.858368 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-cliconfig\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.858526 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-serving-cert\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.858603 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-router-certs\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc 
kubenswrapper[4708]: I0320 16:07:34.858844 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-session\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.858892 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-service-ca\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.858932 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-idp-0-file-data\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.858971 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-ocp-branding-template\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.859041 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkpnj\" (UniqueName: \"kubernetes.io/projected/68a687d9-a448-4a5c-b7b9-e4510468b3c9-kube-api-access-gkpnj\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.859061 4708 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-policies\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.859092 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-error\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.859417 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.860399 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.860965 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861063 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-trusted-ca-bundle\") pod \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\" (UID: \"68a687d9-a448-4a5c-b7b9-e4510468b3c9\") " Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861243 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-audit-policies\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861288 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861320 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861349 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-service-ca\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861420 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-session\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861443 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861540 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861605 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck7pd\" (UniqueName: \"kubernetes.io/projected/b2c62634-e50d-4d5b-96d1-0548caeda176-kube-api-access-ck7pd\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861661 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-template-login\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861769 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861839 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2c62634-e50d-4d5b-96d1-0548caeda176-audit-dir\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861865 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-router-certs\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861921 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.861993 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.862023 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-template-error\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.862117 4708 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.862133 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.862150 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.862165 4708 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68a687d9-a448-4a5c-b7b9-e4510468b3c9-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.862182 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.874336 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod 
"68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.877055 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.879166 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a687d9-a448-4a5c-b7b9-e4510468b3c9-kube-api-access-gkpnj" (OuterVolumeSpecName: "kube-api-access-gkpnj") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "kube-api-access-gkpnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.879841 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.879955 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.880756 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.880955 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.888309 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.888815 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "68a687d9-a448-4a5c-b7b9-e4510468b3c9" (UID: "68a687d9-a448-4a5c-b7b9-e4510468b3c9"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.964444 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-audit-policies\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.964761 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.964879 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.964967 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-service-ca\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.965595 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-session\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.965748 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.965856 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck7pd\" (UniqueName: \"kubernetes.io/projected/b2c62634-e50d-4d5b-96d1-0548caeda176-kube-api-access-ck7pd\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.965960 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-template-login\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " 
pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.966075 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.966189 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2c62634-e50d-4d5b-96d1-0548caeda176-audit-dir\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.965418 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-audit-policies\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.965701 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.965988 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-service-ca\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.966276 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2c62634-e50d-4d5b-96d1-0548caeda176-audit-dir\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.967436 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-router-certs\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.967561 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.967696 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " 
pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.967788 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-template-error\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.967947 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.968032 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.968144 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.968222 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.968296 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-idp-0-file-data\") on node 
\"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.968375 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.968450 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkpnj\" (UniqueName: \"kubernetes.io/projected/68a687d9-a448-4a5c-b7b9-e4510468b3c9-kube-api-access-gkpnj\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.968534 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.968656 4708 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/68a687d9-a448-4a5c-b7b9-e4510468b3c9-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.968561 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.969515 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: 
\"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.970415 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-template-login\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.970429 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-router-certs\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.970816 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.970977 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-template-error\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.971573 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.972636 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.975560 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b2c62634-e50d-4d5b-96d1-0548caeda176-v4-0-config-system-session\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:34 crc kubenswrapper[4708]: I0320 16:07:34.986010 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck7pd\" (UniqueName: \"kubernetes.io/projected/b2c62634-e50d-4d5b-96d1-0548caeda176-kube-api-access-ck7pd\") pod \"oauth-openshift-75df47bf4c-npmwf\" (UID: \"b2c62634-e50d-4d5b-96d1-0548caeda176\") " pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:35 crc kubenswrapper[4708]: I0320 16:07:35.071563 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpk69"] Mar 20 16:07:35 crc kubenswrapper[4708]: I0320 16:07:35.074572 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpk69"] Mar 20 16:07:35 crc 
kubenswrapper[4708]: I0320 16:07:35.102955 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:35 crc kubenswrapper[4708]: I0320 16:07:35.518519 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75df47bf4c-npmwf"] Mar 20 16:07:35 crc kubenswrapper[4708]: I0320 16:07:35.747706 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" event={"ID":"b2c62634-e50d-4d5b-96d1-0548caeda176","Type":"ContainerStarted","Data":"84eb770772abda47ea61124582f238e5a252f5a42d4a2a657a3c4799aec05099"} Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.049403 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mckxb"] Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.050493 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.054071 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.063544 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mckxb"] Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.088403 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33dad410-3bf5-4f44-a5d8-440d08d47ff3-catalog-content\") pod \"redhat-operators-mckxb\" (UID: \"33dad410-3bf5-4f44-a5d8-440d08d47ff3\") " pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.088505 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6p6zc\" (UniqueName: \"kubernetes.io/projected/33dad410-3bf5-4f44-a5d8-440d08d47ff3-kube-api-access-6p6zc\") pod \"redhat-operators-mckxb\" (UID: \"33dad410-3bf5-4f44-a5d8-440d08d47ff3\") " pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.088527 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33dad410-3bf5-4f44-a5d8-440d08d47ff3-utilities\") pod \"redhat-operators-mckxb\" (UID: \"33dad410-3bf5-4f44-a5d8-440d08d47ff3\") " pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.119446 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68a687d9-a448-4a5c-b7b9-e4510468b3c9" path="/var/lib/kubelet/pods/68a687d9-a448-4a5c-b7b9-e4510468b3c9/volumes" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.190398 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p6zc\" (UniqueName: \"kubernetes.io/projected/33dad410-3bf5-4f44-a5d8-440d08d47ff3-kube-api-access-6p6zc\") pod \"redhat-operators-mckxb\" (UID: \"33dad410-3bf5-4f44-a5d8-440d08d47ff3\") " pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.190732 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33dad410-3bf5-4f44-a5d8-440d08d47ff3-utilities\") pod \"redhat-operators-mckxb\" (UID: \"33dad410-3bf5-4f44-a5d8-440d08d47ff3\") " pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.190857 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33dad410-3bf5-4f44-a5d8-440d08d47ff3-catalog-content\") pod \"redhat-operators-mckxb\" (UID: 
\"33dad410-3bf5-4f44-a5d8-440d08d47ff3\") " pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.191385 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33dad410-3bf5-4f44-a5d8-440d08d47ff3-catalog-content\") pod \"redhat-operators-mckxb\" (UID: \"33dad410-3bf5-4f44-a5d8-440d08d47ff3\") " pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.191395 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33dad410-3bf5-4f44-a5d8-440d08d47ff3-utilities\") pod \"redhat-operators-mckxb\" (UID: \"33dad410-3bf5-4f44-a5d8-440d08d47ff3\") " pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.208773 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p6zc\" (UniqueName: \"kubernetes.io/projected/33dad410-3bf5-4f44-a5d8-440d08d47ff3-kube-api-access-6p6zc\") pod \"redhat-operators-mckxb\" (UID: \"33dad410-3bf5-4f44-a5d8-440d08d47ff3\") " pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.258301 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jcp6r"] Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.262588 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.267477 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.271650 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcp6r"] Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.291974 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqpnb\" (UniqueName: \"kubernetes.io/projected/ee83f4f8-cabc-48ce-a0f2-a5c047a43d85-kube-api-access-wqpnb\") pod \"redhat-marketplace-jcp6r\" (UID: \"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85\") " pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.292041 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee83f4f8-cabc-48ce-a0f2-a5c047a43d85-catalog-content\") pod \"redhat-marketplace-jcp6r\" (UID: \"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85\") " pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.292177 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee83f4f8-cabc-48ce-a0f2-a5c047a43d85-utilities\") pod \"redhat-marketplace-jcp6r\" (UID: \"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85\") " pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.365621 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.393923 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqpnb\" (UniqueName: \"kubernetes.io/projected/ee83f4f8-cabc-48ce-a0f2-a5c047a43d85-kube-api-access-wqpnb\") pod \"redhat-marketplace-jcp6r\" (UID: \"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85\") " pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.394006 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee83f4f8-cabc-48ce-a0f2-a5c047a43d85-catalog-content\") pod \"redhat-marketplace-jcp6r\" (UID: \"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85\") " pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.394055 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee83f4f8-cabc-48ce-a0f2-a5c047a43d85-utilities\") pod \"redhat-marketplace-jcp6r\" (UID: \"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85\") " pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.394869 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee83f4f8-cabc-48ce-a0f2-a5c047a43d85-utilities\") pod \"redhat-marketplace-jcp6r\" (UID: \"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85\") " pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.395233 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee83f4f8-cabc-48ce-a0f2-a5c047a43d85-catalog-content\") pod \"redhat-marketplace-jcp6r\" (UID: \"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85\") " 
pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.420046 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqpnb\" (UniqueName: \"kubernetes.io/projected/ee83f4f8-cabc-48ce-a0f2-a5c047a43d85-kube-api-access-wqpnb\") pod \"redhat-marketplace-jcp6r\" (UID: \"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85\") " pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.630000 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.760094 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mckxb"] Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.766391 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" event={"ID":"b2c62634-e50d-4d5b-96d1-0548caeda176","Type":"ContainerStarted","Data":"2f25637077c2b3ccaa3a10e7e8f8d9deb29b05c2327cdff8444307e2530316a5"} Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.766843 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.771049 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" Mar 20 16:07:36 crc kubenswrapper[4708]: W0320 16:07:36.772457 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33dad410_3bf5_4f44_a5d8_440d08d47ff3.slice/crio-f13f491d493add92e5c2a0b65c3e5e769f66fd02a7dbe525fe48d924c12310ba WatchSource:0}: Error finding container f13f491d493add92e5c2a0b65c3e5e769f66fd02a7dbe525fe48d924c12310ba: Status 404 returned error can't find 
the container with id f13f491d493add92e5c2a0b65c3e5e769f66fd02a7dbe525fe48d924c12310ba Mar 20 16:07:36 crc kubenswrapper[4708]: I0320 16:07:36.791043 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-75df47bf4c-npmwf" podStartSLOduration=28.791020824 podStartE2EDuration="28.791020824s" podCreationTimestamp="2026-03-20 16:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:07:36.790132989 +0000 UTC m=+411.464469704" watchObservedRunningTime="2026-03-20 16:07:36.791020824 +0000 UTC m=+411.465357539" Mar 20 16:07:37 crc kubenswrapper[4708]: I0320 16:07:37.079564 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcp6r"] Mar 20 16:07:37 crc kubenswrapper[4708]: I0320 16:07:37.781783 4708 generic.go:334] "Generic (PLEG): container finished" podID="ee83f4f8-cabc-48ce-a0f2-a5c047a43d85" containerID="03a06c850218af89918840288a1c237891adf915e94e3726d33f210d6b87af82" exitCode=0 Mar 20 16:07:37 crc kubenswrapper[4708]: I0320 16:07:37.781886 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcp6r" event={"ID":"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85","Type":"ContainerDied","Data":"03a06c850218af89918840288a1c237891adf915e94e3726d33f210d6b87af82"} Mar 20 16:07:37 crc kubenswrapper[4708]: I0320 16:07:37.782194 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcp6r" event={"ID":"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85","Type":"ContainerStarted","Data":"ebd1d05571c3b741b0d66390e26f46bc01393cdce5879cf6a4d0caeb18eac426"} Mar 20 16:07:37 crc kubenswrapper[4708]: I0320 16:07:37.784021 4708 generic.go:334] "Generic (PLEG): container finished" podID="33dad410-3bf5-4f44-a5d8-440d08d47ff3" containerID="f72e7d9e35fe2352bfd93c2454bd7435bed66027df1bf823c3511984881e8e8f" 
exitCode=0 Mar 20 16:07:37 crc kubenswrapper[4708]: I0320 16:07:37.784071 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mckxb" event={"ID":"33dad410-3bf5-4f44-a5d8-440d08d47ff3","Type":"ContainerDied","Data":"f72e7d9e35fe2352bfd93c2454bd7435bed66027df1bf823c3511984881e8e8f"} Mar 20 16:07:37 crc kubenswrapper[4708]: I0320 16:07:37.784116 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mckxb" event={"ID":"33dad410-3bf5-4f44-a5d8-440d08d47ff3","Type":"ContainerStarted","Data":"f13f491d493add92e5c2a0b65c3e5e769f66fd02a7dbe525fe48d924c12310ba"} Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.460260 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4pc5d"] Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.475436 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pc5d" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.482119 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pc5d"] Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.491261 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.535272 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrq4b\" (UniqueName: \"kubernetes.io/projected/8da9d79b-8b83-4a0b-bd07-dcb278ef137a-kube-api-access-qrq4b\") pod \"community-operators-4pc5d\" (UID: \"8da9d79b-8b83-4a0b-bd07-dcb278ef137a\") " pod="openshift-marketplace/community-operators-4pc5d" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.535432 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/8da9d79b-8b83-4a0b-bd07-dcb278ef137a-catalog-content\") pod \"community-operators-4pc5d\" (UID: \"8da9d79b-8b83-4a0b-bd07-dcb278ef137a\") " pod="openshift-marketplace/community-operators-4pc5d" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.535524 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8da9d79b-8b83-4a0b-bd07-dcb278ef137a-utilities\") pod \"community-operators-4pc5d\" (UID: \"8da9d79b-8b83-4a0b-bd07-dcb278ef137a\") " pod="openshift-marketplace/community-operators-4pc5d" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.637379 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8da9d79b-8b83-4a0b-bd07-dcb278ef137a-catalog-content\") pod \"community-operators-4pc5d\" (UID: \"8da9d79b-8b83-4a0b-bd07-dcb278ef137a\") " pod="openshift-marketplace/community-operators-4pc5d" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.637502 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8da9d79b-8b83-4a0b-bd07-dcb278ef137a-utilities\") pod \"community-operators-4pc5d\" (UID: \"8da9d79b-8b83-4a0b-bd07-dcb278ef137a\") " pod="openshift-marketplace/community-operators-4pc5d" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.637590 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrq4b\" (UniqueName: \"kubernetes.io/projected/8da9d79b-8b83-4a0b-bd07-dcb278ef137a-kube-api-access-qrq4b\") pod \"community-operators-4pc5d\" (UID: \"8da9d79b-8b83-4a0b-bd07-dcb278ef137a\") " pod="openshift-marketplace/community-operators-4pc5d" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.638138 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8da9d79b-8b83-4a0b-bd07-dcb278ef137a-catalog-content\") pod \"community-operators-4pc5d\" (UID: \"8da9d79b-8b83-4a0b-bd07-dcb278ef137a\") " pod="openshift-marketplace/community-operators-4pc5d" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.639513 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8da9d79b-8b83-4a0b-bd07-dcb278ef137a-utilities\") pod \"community-operators-4pc5d\" (UID: \"8da9d79b-8b83-4a0b-bd07-dcb278ef137a\") " pod="openshift-marketplace/community-operators-4pc5d" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.652446 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dpwvk"] Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.653985 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dpwvk" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.657541 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.677183 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dpwvk"] Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.684845 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrq4b\" (UniqueName: \"kubernetes.io/projected/8da9d79b-8b83-4a0b-bd07-dcb278ef137a-kube-api-access-qrq4b\") pod \"community-operators-4pc5d\" (UID: \"8da9d79b-8b83-4a0b-bd07-dcb278ef137a\") " pod="openshift-marketplace/community-operators-4pc5d" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.739591 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d8e20300-bfe6-466f-ba27-835e4b432705-catalog-content\") pod \"certified-operators-dpwvk\" (UID: \"d8e20300-bfe6-466f-ba27-835e4b432705\") " pod="openshift-marketplace/certified-operators-dpwvk" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.739724 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd6mx\" (UniqueName: \"kubernetes.io/projected/d8e20300-bfe6-466f-ba27-835e4b432705-kube-api-access-vd6mx\") pod \"certified-operators-dpwvk\" (UID: \"d8e20300-bfe6-466f-ba27-835e4b432705\") " pod="openshift-marketplace/certified-operators-dpwvk" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.739825 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e20300-bfe6-466f-ba27-835e4b432705-utilities\") pod \"certified-operators-dpwvk\" (UID: \"d8e20300-bfe6-466f-ba27-835e4b432705\") " pod="openshift-marketplace/certified-operators-dpwvk" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.795034 4708 generic.go:334] "Generic (PLEG): container finished" podID="ee83f4f8-cabc-48ce-a0f2-a5c047a43d85" containerID="9ba65085d2e8f6f647eca186da3c634879e6f49c4712bf2c6a779153bd38a3ea" exitCode=0 Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.795130 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcp6r" event={"ID":"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85","Type":"ContainerDied","Data":"9ba65085d2e8f6f647eca186da3c634879e6f49c4712bf2c6a779153bd38a3ea"} Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.832159 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4pc5d" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.841818 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd6mx\" (UniqueName: \"kubernetes.io/projected/d8e20300-bfe6-466f-ba27-835e4b432705-kube-api-access-vd6mx\") pod \"certified-operators-dpwvk\" (UID: \"d8e20300-bfe6-466f-ba27-835e4b432705\") " pod="openshift-marketplace/certified-operators-dpwvk" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.842044 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e20300-bfe6-466f-ba27-835e4b432705-utilities\") pod \"certified-operators-dpwvk\" (UID: \"d8e20300-bfe6-466f-ba27-835e4b432705\") " pod="openshift-marketplace/certified-operators-dpwvk" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.842141 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e20300-bfe6-466f-ba27-835e4b432705-catalog-content\") pod \"certified-operators-dpwvk\" (UID: \"d8e20300-bfe6-466f-ba27-835e4b432705\") " pod="openshift-marketplace/certified-operators-dpwvk" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.843150 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e20300-bfe6-466f-ba27-835e4b432705-catalog-content\") pod \"certified-operators-dpwvk\" (UID: \"d8e20300-bfe6-466f-ba27-835e4b432705\") " pod="openshift-marketplace/certified-operators-dpwvk" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.843252 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e20300-bfe6-466f-ba27-835e4b432705-utilities\") pod \"certified-operators-dpwvk\" (UID: \"d8e20300-bfe6-466f-ba27-835e4b432705\") " 
pod="openshift-marketplace/certified-operators-dpwvk" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.859373 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd6mx\" (UniqueName: \"kubernetes.io/projected/d8e20300-bfe6-466f-ba27-835e4b432705-kube-api-access-vd6mx\") pod \"certified-operators-dpwvk\" (UID: \"d8e20300-bfe6-466f-ba27-835e4b432705\") " pod="openshift-marketplace/certified-operators-dpwvk" Mar 20 16:07:38 crc kubenswrapper[4708]: I0320 16:07:38.974337 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dpwvk" Mar 20 16:07:39 crc kubenswrapper[4708]: I0320 16:07:39.229801 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pc5d"] Mar 20 16:07:39 crc kubenswrapper[4708]: W0320 16:07:39.238196 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8da9d79b_8b83_4a0b_bd07_dcb278ef137a.slice/crio-5e30c5e1984b8f7c0b8d3404c8e73afa120d2d436670f946586aabd3452eacc8 WatchSource:0}: Error finding container 5e30c5e1984b8f7c0b8d3404c8e73afa120d2d436670f946586aabd3452eacc8: Status 404 returned error can't find the container with id 5e30c5e1984b8f7c0b8d3404c8e73afa120d2d436670f946586aabd3452eacc8 Mar 20 16:07:39 crc kubenswrapper[4708]: I0320 16:07:39.365546 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dpwvk"] Mar 20 16:07:39 crc kubenswrapper[4708]: I0320 16:07:39.804215 4708 generic.go:334] "Generic (PLEG): container finished" podID="33dad410-3bf5-4f44-a5d8-440d08d47ff3" containerID="2c67e67de8e1d06fcdff80f722ac7fb1475a52d724e71be2d762a47a3c908343" exitCode=0 Mar 20 16:07:39 crc kubenswrapper[4708]: I0320 16:07:39.804338 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mckxb" 
event={"ID":"33dad410-3bf5-4f44-a5d8-440d08d47ff3","Type":"ContainerDied","Data":"2c67e67de8e1d06fcdff80f722ac7fb1475a52d724e71be2d762a47a3c908343"} Mar 20 16:07:39 crc kubenswrapper[4708]: I0320 16:07:39.808031 4708 generic.go:334] "Generic (PLEG): container finished" podID="8da9d79b-8b83-4a0b-bd07-dcb278ef137a" containerID="00353bb12c758a8fa603a81e5e208fc2d79224d59a0d93750167787eecac94a4" exitCode=0 Mar 20 16:07:39 crc kubenswrapper[4708]: I0320 16:07:39.808102 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pc5d" event={"ID":"8da9d79b-8b83-4a0b-bd07-dcb278ef137a","Type":"ContainerDied","Data":"00353bb12c758a8fa603a81e5e208fc2d79224d59a0d93750167787eecac94a4"} Mar 20 16:07:39 crc kubenswrapper[4708]: I0320 16:07:39.808189 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pc5d" event={"ID":"8da9d79b-8b83-4a0b-bd07-dcb278ef137a","Type":"ContainerStarted","Data":"5e30c5e1984b8f7c0b8d3404c8e73afa120d2d436670f946586aabd3452eacc8"} Mar 20 16:07:39 crc kubenswrapper[4708]: I0320 16:07:39.813976 4708 generic.go:334] "Generic (PLEG): container finished" podID="d8e20300-bfe6-466f-ba27-835e4b432705" containerID="7596b3a34159aa940a252042dd573d827205fb2b8873e57ac659616e1e28c6ea" exitCode=0 Mar 20 16:07:39 crc kubenswrapper[4708]: I0320 16:07:39.814038 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpwvk" event={"ID":"d8e20300-bfe6-466f-ba27-835e4b432705","Type":"ContainerDied","Data":"7596b3a34159aa940a252042dd573d827205fb2b8873e57ac659616e1e28c6ea"} Mar 20 16:07:39 crc kubenswrapper[4708]: I0320 16:07:39.814086 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpwvk" event={"ID":"d8e20300-bfe6-466f-ba27-835e4b432705","Type":"ContainerStarted","Data":"5f8525a5f938589ee62232aa2fac2ec699db7210f6ac4abf4b1d9537b5b0650c"} Mar 20 16:07:40 crc kubenswrapper[4708]: I0320 
16:07:40.821274 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcp6r" event={"ID":"ee83f4f8-cabc-48ce-a0f2-a5c047a43d85","Type":"ContainerStarted","Data":"52fb936c3a32d37dd4c87b50c2b2edd95744445e073767ed5477205905f36e05"} Mar 20 16:07:40 crc kubenswrapper[4708]: I0320 16:07:40.822947 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pc5d" event={"ID":"8da9d79b-8b83-4a0b-bd07-dcb278ef137a","Type":"ContainerStarted","Data":"e99aa3182f6cd62258fa4d0e76d7de082603edf1219708a25ca2f2baa7fa1afe"} Mar 20 16:07:40 crc kubenswrapper[4708]: I0320 16:07:40.824819 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpwvk" event={"ID":"d8e20300-bfe6-466f-ba27-835e4b432705","Type":"ContainerStarted","Data":"c1b541e0c670281e45aa05cc5909f2fc2150f935c1a5286f6f715d802e62d6f7"} Mar 20 16:07:40 crc kubenswrapper[4708]: I0320 16:07:40.828264 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mckxb" event={"ID":"33dad410-3bf5-4f44-a5d8-440d08d47ff3","Type":"ContainerStarted","Data":"567db03219c2ce337287fb0a59dd8c043fe7c63ada8353433a85b3ff009bdb6b"} Mar 20 16:07:40 crc kubenswrapper[4708]: I0320 16:07:40.849848 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jcp6r" podStartSLOduration=2.946541858 podStartE2EDuration="4.849818057s" podCreationTimestamp="2026-03-20 16:07:36 +0000 UTC" firstStartedPulling="2026-03-20 16:07:37.783596537 +0000 UTC m=+412.457933252" lastFinishedPulling="2026-03-20 16:07:39.686872736 +0000 UTC m=+414.361209451" observedRunningTime="2026-03-20 16:07:40.845710263 +0000 UTC m=+415.520046978" watchObservedRunningTime="2026-03-20 16:07:40.849818057 +0000 UTC m=+415.524154772" Mar 20 16:07:40 crc kubenswrapper[4708]: I0320 16:07:40.873041 4708 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-mckxb" podStartSLOduration=2.420706556 podStartE2EDuration="4.873020438s" podCreationTimestamp="2026-03-20 16:07:36 +0000 UTC" firstStartedPulling="2026-03-20 16:07:37.785221282 +0000 UTC m=+412.459557997" lastFinishedPulling="2026-03-20 16:07:40.237535164 +0000 UTC m=+414.911871879" observedRunningTime="2026-03-20 16:07:40.871521067 +0000 UTC m=+415.545857782" watchObservedRunningTime="2026-03-20 16:07:40.873020438 +0000 UTC m=+415.547357143" Mar 20 16:07:41 crc kubenswrapper[4708]: I0320 16:07:41.495771 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk"] Mar 20 16:07:41 crc kubenswrapper[4708]: I0320 16:07:41.496002 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" podUID="02d4142b-e300-4057-a69e-2c750ce861e6" containerName="route-controller-manager" containerID="cri-o://3d6af093a99ef88bbde28175dfd7195533c85abc323c4fd6b84a33a2dcab32bb" gracePeriod=30 Mar 20 16:07:41 crc kubenswrapper[4708]: I0320 16:07:41.837583 4708 generic.go:334] "Generic (PLEG): container finished" podID="8da9d79b-8b83-4a0b-bd07-dcb278ef137a" containerID="e99aa3182f6cd62258fa4d0e76d7de082603edf1219708a25ca2f2baa7fa1afe" exitCode=0 Mar 20 16:07:41 crc kubenswrapper[4708]: I0320 16:07:41.838943 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pc5d" event={"ID":"8da9d79b-8b83-4a0b-bd07-dcb278ef137a","Type":"ContainerDied","Data":"e99aa3182f6cd62258fa4d0e76d7de082603edf1219708a25ca2f2baa7fa1afe"} Mar 20 16:07:41 crc kubenswrapper[4708]: I0320 16:07:41.849901 4708 generic.go:334] "Generic (PLEG): container finished" podID="d8e20300-bfe6-466f-ba27-835e4b432705" containerID="c1b541e0c670281e45aa05cc5909f2fc2150f935c1a5286f6f715d802e62d6f7" exitCode=0 Mar 20 16:07:41 crc kubenswrapper[4708]: 
I0320 16:07:41.849982 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpwvk" event={"ID":"d8e20300-bfe6-466f-ba27-835e4b432705","Type":"ContainerDied","Data":"c1b541e0c670281e45aa05cc5909f2fc2150f935c1a5286f6f715d802e62d6f7"} Mar 20 16:07:41 crc kubenswrapper[4708]: I0320 16:07:41.856200 4708 generic.go:334] "Generic (PLEG): container finished" podID="02d4142b-e300-4057-a69e-2c750ce861e6" containerID="3d6af093a99ef88bbde28175dfd7195533c85abc323c4fd6b84a33a2dcab32bb" exitCode=0 Mar 20 16:07:41 crc kubenswrapper[4708]: I0320 16:07:41.857015 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" event={"ID":"02d4142b-e300-4057-a69e-2c750ce861e6","Type":"ContainerDied","Data":"3d6af093a99ef88bbde28175dfd7195533c85abc323c4fd6b84a33a2dcab32bb"} Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.051660 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.099644 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-client-ca\") pod \"02d4142b-e300-4057-a69e-2c750ce861e6\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.099807 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-config\") pod \"02d4142b-e300-4057-a69e-2c750ce861e6\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.099895 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgx7h\" (UniqueName: \"kubernetes.io/projected/02d4142b-e300-4057-a69e-2c750ce861e6-kube-api-access-jgx7h\") pod \"02d4142b-e300-4057-a69e-2c750ce861e6\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.099969 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02d4142b-e300-4057-a69e-2c750ce861e6-serving-cert\") pod \"02d4142b-e300-4057-a69e-2c750ce861e6\" (UID: \"02d4142b-e300-4057-a69e-2c750ce861e6\") " Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.100587 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-client-ca" (OuterVolumeSpecName: "client-ca") pod "02d4142b-e300-4057-a69e-2c750ce861e6" (UID: "02d4142b-e300-4057-a69e-2c750ce861e6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.101756 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-config" (OuterVolumeSpecName: "config") pod "02d4142b-e300-4057-a69e-2c750ce861e6" (UID: "02d4142b-e300-4057-a69e-2c750ce861e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.106044 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d4142b-e300-4057-a69e-2c750ce861e6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "02d4142b-e300-4057-a69e-2c750ce861e6" (UID: "02d4142b-e300-4057-a69e-2c750ce861e6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.106253 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d4142b-e300-4057-a69e-2c750ce861e6-kube-api-access-jgx7h" (OuterVolumeSpecName: "kube-api-access-jgx7h") pod "02d4142b-e300-4057-a69e-2c750ce861e6" (UID: "02d4142b-e300-4057-a69e-2c750ce861e6"). InnerVolumeSpecName "kube-api-access-jgx7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.201929 4708 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02d4142b-e300-4057-a69e-2c750ce861e6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.201980 4708 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.201996 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02d4142b-e300-4057-a69e-2c750ce861e6-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.202011 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgx7h\" (UniqueName: \"kubernetes.io/projected/02d4142b-e300-4057-a69e-2c750ce861e6-kube-api-access-jgx7h\") on node \"crc\" DevicePath \"\"" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.867925 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pc5d" event={"ID":"8da9d79b-8b83-4a0b-bd07-dcb278ef137a","Type":"ContainerStarted","Data":"10af8bbbc52e3974c56fb906b6d617c2479d50a8e3d8c274027e8c4273b69945"} Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.871391 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dpwvk" event={"ID":"d8e20300-bfe6-466f-ba27-835e4b432705","Type":"ContainerStarted","Data":"5b8418153c7d7f586768fc6a3e48a9ee182f16fd0c26513aa6dc60abf9833d59"} Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.873803 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" 
event={"ID":"02d4142b-e300-4057-a69e-2c750ce861e6","Type":"ContainerDied","Data":"75e3f5a4e460e18040479086d3bdddafde1ce3b81802142c2863348a1b6039b4"} Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.873877 4708 scope.go:117] "RemoveContainer" containerID="3d6af093a99ef88bbde28175dfd7195533c85abc323c4fd6b84a33a2dcab32bb" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.873891 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.893833 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4pc5d" podStartSLOduration=2.335548057 podStartE2EDuration="4.893808982s" podCreationTimestamp="2026-03-20 16:07:38 +0000 UTC" firstStartedPulling="2026-03-20 16:07:39.810782717 +0000 UTC m=+414.485119432" lastFinishedPulling="2026-03-20 16:07:42.369043642 +0000 UTC m=+417.043380357" observedRunningTime="2026-03-20 16:07:42.886502999 +0000 UTC m=+417.560839714" watchObservedRunningTime="2026-03-20 16:07:42.893808982 +0000 UTC m=+417.568145697" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.913852 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dpwvk" podStartSLOduration=2.372712826 podStartE2EDuration="4.913830936s" podCreationTimestamp="2026-03-20 16:07:38 +0000 UTC" firstStartedPulling="2026-03-20 16:07:39.815324663 +0000 UTC m=+414.489661378" lastFinishedPulling="2026-03-20 16:07:42.356442773 +0000 UTC m=+417.030779488" observedRunningTime="2026-03-20 16:07:42.910614597 +0000 UTC m=+417.584951312" watchObservedRunningTime="2026-03-20 16:07:42.913830936 +0000 UTC m=+417.588167661" Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.923151 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk"] Mar 20 16:07:42 crc kubenswrapper[4708]: I0320 16:07:42.926262 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9f786696-rptvk"] Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.059168 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv"] Mar 20 16:07:43 crc kubenswrapper[4708]: E0320 16:07:43.059400 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d4142b-e300-4057-a69e-2c750ce861e6" containerName="route-controller-manager" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.059419 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d4142b-e300-4057-a69e-2c750ce861e6" containerName="route-controller-manager" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.059561 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d4142b-e300-4057-a69e-2c750ce861e6" containerName="route-controller-manager" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.059998 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.062169 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.062530 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.062937 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.064168 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.064319 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.065049 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.071242 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv"] Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.120302 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddqqp\" (UniqueName: \"kubernetes.io/projected/82e607bd-db04-4a4d-b86d-7b98217c2200-kube-api-access-ddqqp\") pod \"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.120356 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82e607bd-db04-4a4d-b86d-7b98217c2200-serving-cert\") pod \"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.120551 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82e607bd-db04-4a4d-b86d-7b98217c2200-config\") pod \"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.120606 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82e607bd-db04-4a4d-b86d-7b98217c2200-client-ca\") pod \"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.222341 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddqqp\" (UniqueName: \"kubernetes.io/projected/82e607bd-db04-4a4d-b86d-7b98217c2200-kube-api-access-ddqqp\") pod \"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.222403 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82e607bd-db04-4a4d-b86d-7b98217c2200-serving-cert\") pod 
\"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.222468 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82e607bd-db04-4a4d-b86d-7b98217c2200-config\") pod \"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.222489 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82e607bd-db04-4a4d-b86d-7b98217c2200-client-ca\") pod \"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.223394 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82e607bd-db04-4a4d-b86d-7b98217c2200-client-ca\") pod \"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.223644 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82e607bd-db04-4a4d-b86d-7b98217c2200-config\") pod \"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.229547 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82e607bd-db04-4a4d-b86d-7b98217c2200-serving-cert\") pod \"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.241814 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddqqp\" (UniqueName: \"kubernetes.io/projected/82e607bd-db04-4a4d-b86d-7b98217c2200-kube-api-access-ddqqp\") pod \"route-controller-manager-6c8ffbd4fb-wbtfv\" (UID: \"82e607bd-db04-4a4d-b86d-7b98217c2200\") " pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.387842 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.798263 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv"] Mar 20 16:07:43 crc kubenswrapper[4708]: I0320 16:07:43.880714 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" event={"ID":"82e607bd-db04-4a4d-b86d-7b98217c2200","Type":"ContainerStarted","Data":"bef54fa10cc859cf0e53c659e553e5c50da9e002e7de04c0dfff042689b8da4a"} Mar 20 16:07:44 crc kubenswrapper[4708]: I0320 16:07:44.118749 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d4142b-e300-4057-a69e-2c750ce861e6" path="/var/lib/kubelet/pods/02d4142b-e300-4057-a69e-2c750ce861e6/volumes" Mar 20 16:07:44 crc kubenswrapper[4708]: I0320 16:07:44.890076 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" 
event={"ID":"82e607bd-db04-4a4d-b86d-7b98217c2200","Type":"ContainerStarted","Data":"8631011f98a2422decf555b1dfc3502e05daa1e1b46cec5c979f825d16461bbe"} Mar 20 16:07:44 crc kubenswrapper[4708]: I0320 16:07:44.890440 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:44 crc kubenswrapper[4708]: I0320 16:07:44.899874 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" Mar 20 16:07:44 crc kubenswrapper[4708]: I0320 16:07:44.912362 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c8ffbd4fb-wbtfv" podStartSLOduration=3.912338642 podStartE2EDuration="3.912338642s" podCreationTimestamp="2026-03-20 16:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:07:44.908602819 +0000 UTC m=+419.582939534" watchObservedRunningTime="2026-03-20 16:07:44.912338642 +0000 UTC m=+419.586675357" Mar 20 16:07:46 crc kubenswrapper[4708]: I0320 16:07:46.370540 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:46 crc kubenswrapper[4708]: I0320 16:07:46.370978 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mckxb" Mar 20 16:07:46 crc kubenswrapper[4708]: I0320 16:07:46.631120 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:46 crc kubenswrapper[4708]: I0320 16:07:46.631215 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jcp6r" Mar 20 16:07:46 crc kubenswrapper[4708]: I0320 
16:07:46.678446 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jcp6r"
Mar 20 16:07:46 crc kubenswrapper[4708]: I0320 16:07:46.952130 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jcp6r"
Mar 20 16:07:47 crc kubenswrapper[4708]: I0320 16:07:47.437854 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mckxb" podUID="33dad410-3bf5-4f44-a5d8-440d08d47ff3" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:07:47 crc kubenswrapper[4708]: timeout: failed to connect service ":50051" within 1s
Mar 20 16:07:47 crc kubenswrapper[4708]: >
Mar 20 16:07:48 crc kubenswrapper[4708]: I0320 16:07:48.832388 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4pc5d"
Mar 20 16:07:48 crc kubenswrapper[4708]: I0320 16:07:48.832485 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4pc5d"
Mar 20 16:07:48 crc kubenswrapper[4708]: I0320 16:07:48.878711 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4pc5d"
Mar 20 16:07:48 crc kubenswrapper[4708]: I0320 16:07:48.962100 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4pc5d"
Mar 20 16:07:48 crc kubenswrapper[4708]: I0320 16:07:48.975500 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dpwvk"
Mar 20 16:07:48 crc kubenswrapper[4708]: I0320 16:07:48.975594 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dpwvk"
Mar 20 16:07:49 crc kubenswrapper[4708]: I0320 16:07:49.020194 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dpwvk"
Mar 20 16:07:49 crc kubenswrapper[4708]: I0320 16:07:49.965757 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dpwvk"
Mar 20 16:07:52 crc kubenswrapper[4708]: I0320 16:07:52.514266 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" podUID="761e5144-aa8a-4203-b166-b5dc638bfe79" containerName="registry" containerID="cri-o://9f4f407d45537d007361fa80c22a4ea8aca3635eda6b8969ca358f4415d40165" gracePeriod=30
Mar 20 16:07:52 crc kubenswrapper[4708]: I0320 16:07:52.933636 4708 generic.go:334] "Generic (PLEG): container finished" podID="761e5144-aa8a-4203-b166-b5dc638bfe79" containerID="9f4f407d45537d007361fa80c22a4ea8aca3635eda6b8969ca358f4415d40165" exitCode=0
Mar 20 16:07:52 crc kubenswrapper[4708]: I0320 16:07:52.933704 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" event={"ID":"761e5144-aa8a-4203-b166-b5dc638bfe79","Type":"ContainerDied","Data":"9f4f407d45537d007361fa80c22a4ea8aca3635eda6b8969ca358f4415d40165"}
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.034868 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr"
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.179162 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"761e5144-aa8a-4203-b166-b5dc638bfe79\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") "
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.179244 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/761e5144-aa8a-4203-b166-b5dc638bfe79-installation-pull-secrets\") pod \"761e5144-aa8a-4203-b166-b5dc638bfe79\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") "
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.179274 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-trusted-ca\") pod \"761e5144-aa8a-4203-b166-b5dc638bfe79\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") "
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.179349 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-certificates\") pod \"761e5144-aa8a-4203-b166-b5dc638bfe79\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") "
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.179385 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xvmh\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-kube-api-access-6xvmh\") pod \"761e5144-aa8a-4203-b166-b5dc638bfe79\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") "
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.179416 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-bound-sa-token\") pod \"761e5144-aa8a-4203-b166-b5dc638bfe79\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") "
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.179441 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/761e5144-aa8a-4203-b166-b5dc638bfe79-ca-trust-extracted\") pod \"761e5144-aa8a-4203-b166-b5dc638bfe79\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") "
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.179475 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-tls\") pod \"761e5144-aa8a-4203-b166-b5dc638bfe79\" (UID: \"761e5144-aa8a-4203-b166-b5dc638bfe79\") "
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.180086 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "761e5144-aa8a-4203-b166-b5dc638bfe79" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.180313 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "761e5144-aa8a-4203-b166-b5dc638bfe79" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.186323 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761e5144-aa8a-4203-b166-b5dc638bfe79-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "761e5144-aa8a-4203-b166-b5dc638bfe79" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.192004 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "761e5144-aa8a-4203-b166-b5dc638bfe79" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.192259 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "761e5144-aa8a-4203-b166-b5dc638bfe79" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.192513 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-kube-api-access-6xvmh" (OuterVolumeSpecName: "kube-api-access-6xvmh") pod "761e5144-aa8a-4203-b166-b5dc638bfe79" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79"). InnerVolumeSpecName "kube-api-access-6xvmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.199966 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "761e5144-aa8a-4203-b166-b5dc638bfe79" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.204099 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/761e5144-aa8a-4203-b166-b5dc638bfe79-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "761e5144-aa8a-4203-b166-b5dc638bfe79" (UID: "761e5144-aa8a-4203-b166-b5dc638bfe79"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.280932 4708 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.280984 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xvmh\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-kube-api-access-6xvmh\") on node \"crc\" DevicePath \"\""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.280998 4708 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.281010 4708 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/761e5144-aa8a-4203-b166-b5dc638bfe79-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.281028 4708 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/761e5144-aa8a-4203-b166-b5dc638bfe79-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.281044 4708 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/761e5144-aa8a-4203-b166-b5dc638bfe79-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.281055 4708 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/761e5144-aa8a-4203-b166-b5dc638bfe79-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.941985 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr" event={"ID":"761e5144-aa8a-4203-b166-b5dc638bfe79","Type":"ContainerDied","Data":"d32517300d2fd960512c82f32c7b881581115340f83df58b03321fdfca63a33e"}
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.942050 4708 scope.go:117] "RemoveContainer" containerID="9f4f407d45537d007361fa80c22a4ea8aca3635eda6b8969ca358f4415d40165"
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.942084 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tkdfr"
Mar 20 16:07:53 crc kubenswrapper[4708]: I0320 16:07:53.993220 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tkdfr"]
Mar 20 16:07:54 crc kubenswrapper[4708]: I0320 16:07:54.001810 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tkdfr"]
Mar 20 16:07:54 crc kubenswrapper[4708]: I0320 16:07:54.122122 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761e5144-aa8a-4203-b166-b5dc638bfe79" path="/var/lib/kubelet/pods/761e5144-aa8a-4203-b166-b5dc638bfe79/volumes"
Mar 20 16:07:56 crc kubenswrapper[4708]: I0320 16:07:56.178966 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:07:56 crc kubenswrapper[4708]: I0320 16:07:56.179358 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:07:56 crc kubenswrapper[4708]: I0320 16:07:56.407368 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mckxb"
Mar 20 16:07:56 crc kubenswrapper[4708]: I0320 16:07:56.442035 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mckxb"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.136214 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567048-gwjs8"]
Mar 20 16:08:00 crc kubenswrapper[4708]: E0320 16:08:00.136868 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761e5144-aa8a-4203-b166-b5dc638bfe79" containerName="registry"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.136885 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="761e5144-aa8a-4203-b166-b5dc638bfe79" containerName="registry"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.137012 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="761e5144-aa8a-4203-b166-b5dc638bfe79" containerName="registry"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.137459 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-gwjs8"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.142351 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.142567 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.142596 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.145179 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-gwjs8"]
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.267198 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtcrt\" (UniqueName: \"kubernetes.io/projected/6ba792d3-1a9c-4aa8-929f-fa66ba85a87a-kube-api-access-wtcrt\") pod \"auto-csr-approver-29567048-gwjs8\" (UID: \"6ba792d3-1a9c-4aa8-929f-fa66ba85a87a\") " pod="openshift-infra/auto-csr-approver-29567048-gwjs8"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.368140 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtcrt\" (UniqueName: \"kubernetes.io/projected/6ba792d3-1a9c-4aa8-929f-fa66ba85a87a-kube-api-access-wtcrt\") pod \"auto-csr-approver-29567048-gwjs8\" (UID: \"6ba792d3-1a9c-4aa8-929f-fa66ba85a87a\") " pod="openshift-infra/auto-csr-approver-29567048-gwjs8"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.385909 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtcrt\" (UniqueName: \"kubernetes.io/projected/6ba792d3-1a9c-4aa8-929f-fa66ba85a87a-kube-api-access-wtcrt\") pod \"auto-csr-approver-29567048-gwjs8\" (UID: \"6ba792d3-1a9c-4aa8-929f-fa66ba85a87a\") " pod="openshift-infra/auto-csr-approver-29567048-gwjs8"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.460819 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-gwjs8"
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.888238 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-gwjs8"]
Mar 20 16:08:00 crc kubenswrapper[4708]: W0320 16:08:00.893701 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ba792d3_1a9c_4aa8_929f_fa66ba85a87a.slice/crio-ed958bda2f9336beb0f87c1705e3463a3da41a8226ae130935e02dc21ff713d0 WatchSource:0}: Error finding container ed958bda2f9336beb0f87c1705e3463a3da41a8226ae130935e02dc21ff713d0: Status 404 returned error can't find the container with id ed958bda2f9336beb0f87c1705e3463a3da41a8226ae130935e02dc21ff713d0
Mar 20 16:08:00 crc kubenswrapper[4708]: I0320 16:08:00.981141 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-gwjs8" event={"ID":"6ba792d3-1a9c-4aa8-929f-fa66ba85a87a","Type":"ContainerStarted","Data":"ed958bda2f9336beb0f87c1705e3463a3da41a8226ae130935e02dc21ff713d0"}
Mar 20 16:08:02 crc kubenswrapper[4708]: I0320 16:08:02.994330 4708 generic.go:334] "Generic (PLEG): container finished" podID="6ba792d3-1a9c-4aa8-929f-fa66ba85a87a" containerID="4eccf946645350d85645c5cec6efbd3ad200868cd3f27bfe46981cb45c67e972" exitCode=0
Mar 20 16:08:02 crc kubenswrapper[4708]: I0320 16:08:02.994386 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-gwjs8" event={"ID":"6ba792d3-1a9c-4aa8-929f-fa66ba85a87a","Type":"ContainerDied","Data":"4eccf946645350d85645c5cec6efbd3ad200868cd3f27bfe46981cb45c67e972"}
Mar 20 16:08:04 crc kubenswrapper[4708]: I0320 16:08:04.346750 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-gwjs8"
Mar 20 16:08:04 crc kubenswrapper[4708]: I0320 16:08:04.420930 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtcrt\" (UniqueName: \"kubernetes.io/projected/6ba792d3-1a9c-4aa8-929f-fa66ba85a87a-kube-api-access-wtcrt\") pod \"6ba792d3-1a9c-4aa8-929f-fa66ba85a87a\" (UID: \"6ba792d3-1a9c-4aa8-929f-fa66ba85a87a\") "
Mar 20 16:08:04 crc kubenswrapper[4708]: I0320 16:08:04.427874 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba792d3-1a9c-4aa8-929f-fa66ba85a87a-kube-api-access-wtcrt" (OuterVolumeSpecName: "kube-api-access-wtcrt") pod "6ba792d3-1a9c-4aa8-929f-fa66ba85a87a" (UID: "6ba792d3-1a9c-4aa8-929f-fa66ba85a87a"). InnerVolumeSpecName "kube-api-access-wtcrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:08:04 crc kubenswrapper[4708]: I0320 16:08:04.522352 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtcrt\" (UniqueName: \"kubernetes.io/projected/6ba792d3-1a9c-4aa8-929f-fa66ba85a87a-kube-api-access-wtcrt\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:05 crc kubenswrapper[4708]: I0320 16:08:05.023105 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-gwjs8" event={"ID":"6ba792d3-1a9c-4aa8-929f-fa66ba85a87a","Type":"ContainerDied","Data":"ed958bda2f9336beb0f87c1705e3463a3da41a8226ae130935e02dc21ff713d0"}
Mar 20 16:08:05 crc kubenswrapper[4708]: I0320 16:08:05.023192 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed958bda2f9336beb0f87c1705e3463a3da41a8226ae130935e02dc21ff713d0"
Mar 20 16:08:05 crc kubenswrapper[4708]: I0320 16:08:05.023195 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-gwjs8"
Mar 20 16:08:26 crc kubenswrapper[4708]: I0320 16:08:26.178706 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:08:26 crc kubenswrapper[4708]: I0320 16:08:26.179290 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:08:56 crc kubenswrapper[4708]: I0320 16:08:56.178907 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:08:56 crc kubenswrapper[4708]: I0320 16:08:56.179516 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:08:56 crc kubenswrapper[4708]: I0320 16:08:56.179574 4708 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9"
Mar 20 16:08:56 crc kubenswrapper[4708]: I0320 16:08:56.180324 4708 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"340f6ff788bab07854dcb09b786205fe1caea2674ea8aaca989428a747316c8f"} pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:08:56 crc kubenswrapper[4708]: I0320 16:08:56.180393 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" containerID="cri-o://340f6ff788bab07854dcb09b786205fe1caea2674ea8aaca989428a747316c8f" gracePeriod=600
Mar 20 16:08:56 crc kubenswrapper[4708]: I0320 16:08:56.338720 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerDied","Data":"340f6ff788bab07854dcb09b786205fe1caea2674ea8aaca989428a747316c8f"}
Mar 20 16:08:56 crc kubenswrapper[4708]: I0320 16:08:56.338805 4708 scope.go:117] "RemoveContainer" containerID="d6ec59ed8d66432bb13b29d4fa3a7ccf2676c91d4085cd9c251935a8f6c26bd1"
Mar 20 16:08:56 crc kubenswrapper[4708]: I0320 16:08:56.338711 4708 generic.go:334] "Generic (PLEG): container finished" podID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerID="340f6ff788bab07854dcb09b786205fe1caea2674ea8aaca989428a747316c8f" exitCode=0
Mar 20 16:08:57 crc kubenswrapper[4708]: I0320 16:08:57.354016 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerStarted","Data":"86c51de47a3ecb84e60b79380374d49d0675ffee3378ce7301ca5717dc50a07c"}
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.148518 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567050-gqgxj"]
Mar 20 16:10:00 crc kubenswrapper[4708]: E0320 16:10:00.149444 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba792d3-1a9c-4aa8-929f-fa66ba85a87a" containerName="oc"
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.149463 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba792d3-1a9c-4aa8-929f-fa66ba85a87a" containerName="oc"
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.149641 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba792d3-1a9c-4aa8-929f-fa66ba85a87a" containerName="oc"
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.150218 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-gqgxj"
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.152819 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5"
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.152978 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.152709 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.153190 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-gqgxj"]
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.316780 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hggd8\" (UniqueName: \"kubernetes.io/projected/e1ddd964-0a24-464f-82f0-99152d1b7839-kube-api-access-hggd8\") pod \"auto-csr-approver-29567050-gqgxj\" (UID: \"e1ddd964-0a24-464f-82f0-99152d1b7839\") " pod="openshift-infra/auto-csr-approver-29567050-gqgxj"
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.417968 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hggd8\" (UniqueName: \"kubernetes.io/projected/e1ddd964-0a24-464f-82f0-99152d1b7839-kube-api-access-hggd8\") pod \"auto-csr-approver-29567050-gqgxj\" (UID: \"e1ddd964-0a24-464f-82f0-99152d1b7839\") " pod="openshift-infra/auto-csr-approver-29567050-gqgxj"
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.443206 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hggd8\" (UniqueName: \"kubernetes.io/projected/e1ddd964-0a24-464f-82f0-99152d1b7839-kube-api-access-hggd8\") pod \"auto-csr-approver-29567050-gqgxj\" (UID: \"e1ddd964-0a24-464f-82f0-99152d1b7839\") " pod="openshift-infra/auto-csr-approver-29567050-gqgxj"
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.471882 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-gqgxj"
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.940947 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-gqgxj"]
Mar 20 16:10:00 crc kubenswrapper[4708]: I0320 16:10:00.951302 4708 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:10:01 crc kubenswrapper[4708]: I0320 16:10:01.796029 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-gqgxj" event={"ID":"e1ddd964-0a24-464f-82f0-99152d1b7839","Type":"ContainerStarted","Data":"a0dfb18bb1cb2be8e7ac4509c7cf226e513774b74b43d2b6c6a6149d96f5af41"}
Mar 20 16:10:02 crc kubenswrapper[4708]: I0320 16:10:02.802434 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-gqgxj" event={"ID":"e1ddd964-0a24-464f-82f0-99152d1b7839","Type":"ContainerStarted","Data":"f470bd0f3ccfa0a28663b9f7725ba6373d4c96ab2ee74fca2fe3e04bd0b68f80"}
Mar 20 16:10:02 crc kubenswrapper[4708]: I0320 16:10:02.817022 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567050-gqgxj" podStartSLOduration=1.302136688 podStartE2EDuration="2.817006542s" podCreationTimestamp="2026-03-20 16:10:00 +0000 UTC" firstStartedPulling="2026-03-20 16:10:00.951060251 +0000 UTC m=+555.625396976" lastFinishedPulling="2026-03-20 16:10:02.465930075 +0000 UTC m=+557.140266830" observedRunningTime="2026-03-20 16:10:02.81656268 +0000 UTC m=+557.490899395" watchObservedRunningTime="2026-03-20 16:10:02.817006542 +0000 UTC m=+557.491343257"
Mar 20 16:10:03 crc kubenswrapper[4708]: I0320 16:10:03.809451 4708 generic.go:334] "Generic (PLEG): container finished" podID="e1ddd964-0a24-464f-82f0-99152d1b7839" containerID="f470bd0f3ccfa0a28663b9f7725ba6373d4c96ab2ee74fca2fe3e04bd0b68f80" exitCode=0
Mar 20 16:10:03 crc kubenswrapper[4708]: I0320 16:10:03.809531 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-gqgxj" event={"ID":"e1ddd964-0a24-464f-82f0-99152d1b7839","Type":"ContainerDied","Data":"f470bd0f3ccfa0a28663b9f7725ba6373d4c96ab2ee74fca2fe3e04bd0b68f80"}
Mar 20 16:10:05 crc kubenswrapper[4708]: I0320 16:10:05.019510 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-gqgxj"
Mar 20 16:10:05 crc kubenswrapper[4708]: I0320 16:10:05.179370 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hggd8\" (UniqueName: \"kubernetes.io/projected/e1ddd964-0a24-464f-82f0-99152d1b7839-kube-api-access-hggd8\") pod \"e1ddd964-0a24-464f-82f0-99152d1b7839\" (UID: \"e1ddd964-0a24-464f-82f0-99152d1b7839\") "
Mar 20 16:10:05 crc kubenswrapper[4708]: I0320 16:10:05.184907 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ddd964-0a24-464f-82f0-99152d1b7839-kube-api-access-hggd8" (OuterVolumeSpecName: "kube-api-access-hggd8") pod "e1ddd964-0a24-464f-82f0-99152d1b7839" (UID: "e1ddd964-0a24-464f-82f0-99152d1b7839"). InnerVolumeSpecName "kube-api-access-hggd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:10:05 crc kubenswrapper[4708]: I0320 16:10:05.281308 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hggd8\" (UniqueName: \"kubernetes.io/projected/e1ddd964-0a24-464f-82f0-99152d1b7839-kube-api-access-hggd8\") on node \"crc\" DevicePath \"\""
Mar 20 16:10:05 crc kubenswrapper[4708]: I0320 16:10:05.822412 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-gqgxj" event={"ID":"e1ddd964-0a24-464f-82f0-99152d1b7839","Type":"ContainerDied","Data":"a0dfb18bb1cb2be8e7ac4509c7cf226e513774b74b43d2b6c6a6149d96f5af41"}
Mar 20 16:10:05 crc kubenswrapper[4708]: I0320 16:10:05.822454 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0dfb18bb1cb2be8e7ac4509c7cf226e513774b74b43d2b6c6a6149d96f5af41"
Mar 20 16:10:05 crc kubenswrapper[4708]: I0320 16:10:05.822482 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-gqgxj"
Mar 20 16:10:05 crc kubenswrapper[4708]: I0320 16:10:05.861886 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-8vwl5"]
Mar 20 16:10:05 crc kubenswrapper[4708]: I0320 16:10:05.864919 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-8vwl5"]
Mar 20 16:10:06 crc kubenswrapper[4708]: I0320 16:10:06.117743 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69fd6ec4-db77-4309-977a-7e80359aec50" path="/var/lib/kubelet/pods/69fd6ec4-db77-4309-977a-7e80359aec50/volumes"
Mar 20 16:10:56 crc kubenswrapper[4708]: I0320 16:10:56.179154 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:10:56 crc kubenswrapper[4708]: I0320 16:10:56.180077 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:11:26 crc kubenswrapper[4708]: I0320 16:11:26.178124 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:11:26 crc kubenswrapper[4708]: I0320 16:11:26.178649 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:11:56 crc kubenswrapper[4708]: I0320 16:11:56.178469 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:11:56 crc kubenswrapper[4708]: I0320 16:11:56.179025 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:11:56 crc kubenswrapper[4708]: I0320 16:11:56.179070 4708 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9"
Mar 20 16:11:56 crc kubenswrapper[4708]: I0320 16:11:56.179611 4708 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86c51de47a3ecb84e60b79380374d49d0675ffee3378ce7301ca5717dc50a07c"} pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:11:56 crc kubenswrapper[4708]: I0320 16:11:56.179657 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" containerID="cri-o://86c51de47a3ecb84e60b79380374d49d0675ffee3378ce7301ca5717dc50a07c" gracePeriod=600
Mar 20 16:11:56 crc kubenswrapper[4708]: I0320 16:11:56.496199 4708 generic.go:334] "Generic (PLEG): container finished" podID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerID="86c51de47a3ecb84e60b79380374d49d0675ffee3378ce7301ca5717dc50a07c" exitCode=0
Mar 20 16:11:56 crc kubenswrapper[4708]: I0320 16:11:56.496287 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerDied","Data":"86c51de47a3ecb84e60b79380374d49d0675ffee3378ce7301ca5717dc50a07c"}
Mar 20 16:11:56 crc kubenswrapper[4708]: I0320 16:11:56.496606 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerStarted","Data":"d5c914e606937f3e664e2e028c8e0d408cfdd520b7dcf21b2c2c87ba6f561a9d"}
Mar 20 16:11:56 crc kubenswrapper[4708]: I0320 16:11:56.496683 4708 scope.go:117] "RemoveContainer" containerID="340f6ff788bab07854dcb09b786205fe1caea2674ea8aaca989428a747316c8f"
Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.148348 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567052-hg5th"]
Mar 20 16:12:00 crc kubenswrapper[4708]: E0320 16:12:00.149318 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ddd964-0a24-464f-82f0-99152d1b7839" containerName="oc"
Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.149342 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ddd964-0a24-464f-82f0-99152d1b7839" containerName="oc"
Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.149518 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ddd964-0a24-464f-82f0-99152d1b7839" containerName="oc"
Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.150065 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-hg5th"
Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.156205 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.156352 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.156439 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5"
Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.162399 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-hg5th"]
Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.214305 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmwxg\" (UniqueName: \"kubernetes.io/projected/6ba33465-4320-42d8-9ac4-eff3f7d5bca8-kube-api-access-kmwxg\")
pod \"auto-csr-approver-29567052-hg5th\" (UID: \"6ba33465-4320-42d8-9ac4-eff3f7d5bca8\") " pod="openshift-infra/auto-csr-approver-29567052-hg5th" Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.316476 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmwxg\" (UniqueName: \"kubernetes.io/projected/6ba33465-4320-42d8-9ac4-eff3f7d5bca8-kube-api-access-kmwxg\") pod \"auto-csr-approver-29567052-hg5th\" (UID: \"6ba33465-4320-42d8-9ac4-eff3f7d5bca8\") " pod="openshift-infra/auto-csr-approver-29567052-hg5th" Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.348785 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmwxg\" (UniqueName: \"kubernetes.io/projected/6ba33465-4320-42d8-9ac4-eff3f7d5bca8-kube-api-access-kmwxg\") pod \"auto-csr-approver-29567052-hg5th\" (UID: \"6ba33465-4320-42d8-9ac4-eff3f7d5bca8\") " pod="openshift-infra/auto-csr-approver-29567052-hg5th" Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.475878 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-hg5th" Mar 20 16:12:00 crc kubenswrapper[4708]: I0320 16:12:00.676378 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-hg5th"] Mar 20 16:12:01 crc kubenswrapper[4708]: I0320 16:12:01.532177 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-hg5th" event={"ID":"6ba33465-4320-42d8-9ac4-eff3f7d5bca8","Type":"ContainerStarted","Data":"5665479b3b2d98106607cb25dc372a57bd8655178ccd1a05662f873ae674f468"} Mar 20 16:12:03 crc kubenswrapper[4708]: I0320 16:12:03.545883 4708 generic.go:334] "Generic (PLEG): container finished" podID="6ba33465-4320-42d8-9ac4-eff3f7d5bca8" containerID="2dd799a0f98e8e8abcf4f9b4ce9724929354020078d9d4633d99a9d33d752054" exitCode=0 Mar 20 16:12:03 crc kubenswrapper[4708]: I0320 16:12:03.546157 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-hg5th" event={"ID":"6ba33465-4320-42d8-9ac4-eff3f7d5bca8","Type":"ContainerDied","Data":"2dd799a0f98e8e8abcf4f9b4ce9724929354020078d9d4633d99a9d33d752054"} Mar 20 16:12:04 crc kubenswrapper[4708]: I0320 16:12:04.804470 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-hg5th" Mar 20 16:12:04 crc kubenswrapper[4708]: I0320 16:12:04.979541 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmwxg\" (UniqueName: \"kubernetes.io/projected/6ba33465-4320-42d8-9ac4-eff3f7d5bca8-kube-api-access-kmwxg\") pod \"6ba33465-4320-42d8-9ac4-eff3f7d5bca8\" (UID: \"6ba33465-4320-42d8-9ac4-eff3f7d5bca8\") " Mar 20 16:12:04 crc kubenswrapper[4708]: I0320 16:12:04.984929 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba33465-4320-42d8-9ac4-eff3f7d5bca8-kube-api-access-kmwxg" (OuterVolumeSpecName: "kube-api-access-kmwxg") pod "6ba33465-4320-42d8-9ac4-eff3f7d5bca8" (UID: "6ba33465-4320-42d8-9ac4-eff3f7d5bca8"). InnerVolumeSpecName "kube-api-access-kmwxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:12:05 crc kubenswrapper[4708]: I0320 16:12:05.081994 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmwxg\" (UniqueName: \"kubernetes.io/projected/6ba33465-4320-42d8-9ac4-eff3f7d5bca8-kube-api-access-kmwxg\") on node \"crc\" DevicePath \"\"" Mar 20 16:12:05 crc kubenswrapper[4708]: I0320 16:12:05.558654 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-hg5th" event={"ID":"6ba33465-4320-42d8-9ac4-eff3f7d5bca8","Type":"ContainerDied","Data":"5665479b3b2d98106607cb25dc372a57bd8655178ccd1a05662f873ae674f468"} Mar 20 16:12:05 crc kubenswrapper[4708]: I0320 16:12:05.558709 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5665479b3b2d98106607cb25dc372a57bd8655178ccd1a05662f873ae674f468" Mar 20 16:12:05 crc kubenswrapper[4708]: I0320 16:12:05.558685 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-hg5th" Mar 20 16:12:05 crc kubenswrapper[4708]: I0320 16:12:05.869412 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-99wtr"] Mar 20 16:12:05 crc kubenswrapper[4708]: I0320 16:12:05.876226 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-99wtr"] Mar 20 16:12:06 crc kubenswrapper[4708]: I0320 16:12:06.124238 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb" path="/var/lib/kubelet/pods/cabea874-7e0b-4aab-a7a5-b5c5fcfc78bb/volumes" Mar 20 16:12:10 crc kubenswrapper[4708]: I0320 16:12:10.951050 4708 scope.go:117] "RemoveContainer" containerID="641fb982898e899e43560465ef8157ebbd867e59cf3b51e230c9ea13b27cba66" Mar 20 16:13:11 crc kubenswrapper[4708]: I0320 16:13:11.029169 4708 scope.go:117] "RemoveContainer" containerID="43eda6186390c28f2015dd7ac4c88acf45551a4ec2f89997a3a9fdc781204fd4" Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.944197 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-w7lk6"] Mar 20 16:13:48 crc kubenswrapper[4708]: E0320 16:13:48.945227 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba33465-4320-42d8-9ac4-eff3f7d5bca8" containerName="oc" Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.945247 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba33465-4320-42d8-9ac4-eff3f7d5bca8" containerName="oc" Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.945365 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba33465-4320-42d8-9ac4-eff3f7d5bca8" containerName="oc" Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.945826 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-w7lk6" Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.947242 4708 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2zd7l" Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.947534 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.947877 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.956395 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-jdvxw"] Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.957004 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-jdvxw" Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.958244 4708 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-f4mtg" Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.961087 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-w7lk6"] Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.969050 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-jdvxw"] Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.986362 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tgmps"] Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.987312 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-tgmps" Mar 20 16:13:48 crc kubenswrapper[4708]: I0320 16:13:48.991120 4708 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pbbdf" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.013766 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tgmps"] Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.059404 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tljr4\" (UniqueName: \"kubernetes.io/projected/fa91730d-98e2-4cdf-a110-b3f8a4a95731-kube-api-access-tljr4\") pod \"cert-manager-webhook-687f57d79b-tgmps\" (UID: \"fa91730d-98e2-4cdf-a110-b3f8a4a95731\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tgmps" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.059506 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktrc8\" (UniqueName: \"kubernetes.io/projected/720cff92-f259-40b0-a7cb-efa0a67b8ff4-kube-api-access-ktrc8\") pod \"cert-manager-858654f9db-jdvxw\" (UID: \"720cff92-f259-40b0-a7cb-efa0a67b8ff4\") " pod="cert-manager/cert-manager-858654f9db-jdvxw" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.059548 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8qjg\" (UniqueName: \"kubernetes.io/projected/49e88ccc-7491-4ca3-8f14-4e5a82093d0b-kube-api-access-p8qjg\") pod \"cert-manager-cainjector-cf98fcc89-w7lk6\" (UID: \"49e88ccc-7491-4ca3-8f14-4e5a82093d0b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-w7lk6" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.161009 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tljr4\" (UniqueName: 
\"kubernetes.io/projected/fa91730d-98e2-4cdf-a110-b3f8a4a95731-kube-api-access-tljr4\") pod \"cert-manager-webhook-687f57d79b-tgmps\" (UID: \"fa91730d-98e2-4cdf-a110-b3f8a4a95731\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tgmps" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.161146 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktrc8\" (UniqueName: \"kubernetes.io/projected/720cff92-f259-40b0-a7cb-efa0a67b8ff4-kube-api-access-ktrc8\") pod \"cert-manager-858654f9db-jdvxw\" (UID: \"720cff92-f259-40b0-a7cb-efa0a67b8ff4\") " pod="cert-manager/cert-manager-858654f9db-jdvxw" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.161197 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8qjg\" (UniqueName: \"kubernetes.io/projected/49e88ccc-7491-4ca3-8f14-4e5a82093d0b-kube-api-access-p8qjg\") pod \"cert-manager-cainjector-cf98fcc89-w7lk6\" (UID: \"49e88ccc-7491-4ca3-8f14-4e5a82093d0b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-w7lk6" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.184560 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktrc8\" (UniqueName: \"kubernetes.io/projected/720cff92-f259-40b0-a7cb-efa0a67b8ff4-kube-api-access-ktrc8\") pod \"cert-manager-858654f9db-jdvxw\" (UID: \"720cff92-f259-40b0-a7cb-efa0a67b8ff4\") " pod="cert-manager/cert-manager-858654f9db-jdvxw" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.186394 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8qjg\" (UniqueName: \"kubernetes.io/projected/49e88ccc-7491-4ca3-8f14-4e5a82093d0b-kube-api-access-p8qjg\") pod \"cert-manager-cainjector-cf98fcc89-w7lk6\" (UID: \"49e88ccc-7491-4ca3-8f14-4e5a82093d0b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-w7lk6" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.187364 4708 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tljr4\" (UniqueName: \"kubernetes.io/projected/fa91730d-98e2-4cdf-a110-b3f8a4a95731-kube-api-access-tljr4\") pod \"cert-manager-webhook-687f57d79b-tgmps\" (UID: \"fa91730d-98e2-4cdf-a110-b3f8a4a95731\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tgmps" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.268444 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-w7lk6" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.276796 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-jdvxw" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.308004 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-tgmps" Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.589200 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tgmps"] Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.738463 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-w7lk6"] Mar 20 16:13:49 crc kubenswrapper[4708]: I0320 16:13:49.741320 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-jdvxw"] Mar 20 16:13:49 crc kubenswrapper[4708]: W0320 16:13:49.745319 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49e88ccc_7491_4ca3_8f14_4e5a82093d0b.slice/crio-6c824300a4163340d046e254d1e63822711547584da3a9e4fb86ee0fad35dcb9 WatchSource:0}: Error finding container 6c824300a4163340d046e254d1e63822711547584da3a9e4fb86ee0fad35dcb9: Status 404 returned error can't find the container with id 6c824300a4163340d046e254d1e63822711547584da3a9e4fb86ee0fad35dcb9 Mar 20 16:13:49 crc 
kubenswrapper[4708]: W0320 16:13:49.747036 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod720cff92_f259_40b0_a7cb_efa0a67b8ff4.slice/crio-efb04433c4e524c08b550515cda8a80fe5773caa8f90442008a71f2b52816299 WatchSource:0}: Error finding container efb04433c4e524c08b550515cda8a80fe5773caa8f90442008a71f2b52816299: Status 404 returned error can't find the container with id efb04433c4e524c08b550515cda8a80fe5773caa8f90442008a71f2b52816299 Mar 20 16:13:50 crc kubenswrapper[4708]: I0320 16:13:50.222303 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-w7lk6" event={"ID":"49e88ccc-7491-4ca3-8f14-4e5a82093d0b","Type":"ContainerStarted","Data":"6c824300a4163340d046e254d1e63822711547584da3a9e4fb86ee0fad35dcb9"} Mar 20 16:13:50 crc kubenswrapper[4708]: I0320 16:13:50.223801 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-tgmps" event={"ID":"fa91730d-98e2-4cdf-a110-b3f8a4a95731","Type":"ContainerStarted","Data":"83a74d23d90985c9e1f3a04698b980985f086864a0b1db2fb23ebec7ab7ffd3d"} Mar 20 16:13:50 crc kubenswrapper[4708]: I0320 16:13:50.224838 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jdvxw" event={"ID":"720cff92-f259-40b0-a7cb-efa0a67b8ff4","Type":"ContainerStarted","Data":"efb04433c4e524c08b550515cda8a80fe5773caa8f90442008a71f2b52816299"} Mar 20 16:13:52 crc kubenswrapper[4708]: I0320 16:13:52.239760 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-w7lk6" event={"ID":"49e88ccc-7491-4ca3-8f14-4e5a82093d0b","Type":"ContainerStarted","Data":"88d65b093799829b9626a0110155d668ea4d74403c788684fb97cde02e71bf2f"} Mar 20 16:13:52 crc kubenswrapper[4708]: I0320 16:13:52.255126 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-w7lk6" podStartSLOduration=2.094938622 podStartE2EDuration="4.255104174s" podCreationTimestamp="2026-03-20 16:13:48 +0000 UTC" firstStartedPulling="2026-03-20 16:13:49.748053661 +0000 UTC m=+784.422390376" lastFinishedPulling="2026-03-20 16:13:51.908219213 +0000 UTC m=+786.582555928" observedRunningTime="2026-03-20 16:13:52.254320914 +0000 UTC m=+786.928657639" watchObservedRunningTime="2026-03-20 16:13:52.255104174 +0000 UTC m=+786.929440909" Mar 20 16:13:54 crc kubenswrapper[4708]: I0320 16:13:54.250751 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-tgmps" event={"ID":"fa91730d-98e2-4cdf-a110-b3f8a4a95731","Type":"ContainerStarted","Data":"3854af6ab8774e955c9f92fc0e054ad0b171a07f66db3d80950f24881b115bfb"} Mar 20 16:13:54 crc kubenswrapper[4708]: I0320 16:13:54.252628 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-tgmps" Mar 20 16:13:54 crc kubenswrapper[4708]: I0320 16:13:54.254058 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jdvxw" event={"ID":"720cff92-f259-40b0-a7cb-efa0a67b8ff4","Type":"ContainerStarted","Data":"c597b20e8a98545f48f27c1eddc235af355120385c82943598c7cd616444e9e8"} Mar 20 16:13:54 crc kubenswrapper[4708]: I0320 16:13:54.678814 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-tgmps" podStartSLOduration=2.6110020179999998 podStartE2EDuration="6.678795828s" podCreationTimestamp="2026-03-20 16:13:48 +0000 UTC" firstStartedPulling="2026-03-20 16:13:49.601579475 +0000 UTC m=+784.275916180" lastFinishedPulling="2026-03-20 16:13:53.669373285 +0000 UTC m=+788.343709990" observedRunningTime="2026-03-20 16:13:54.677835932 +0000 UTC m=+789.352172657" watchObservedRunningTime="2026-03-20 16:13:54.678795828 +0000 UTC m=+789.353132543" Mar 20 16:13:54 crc 
kubenswrapper[4708]: I0320 16:13:54.694065 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-jdvxw" podStartSLOduration=2.676007636 podStartE2EDuration="6.694044518s" podCreationTimestamp="2026-03-20 16:13:48 +0000 UTC" firstStartedPulling="2026-03-20 16:13:49.748909794 +0000 UTC m=+784.423246509" lastFinishedPulling="2026-03-20 16:13:53.766946666 +0000 UTC m=+788.441283391" observedRunningTime="2026-03-20 16:13:54.693239035 +0000 UTC m=+789.367575750" watchObservedRunningTime="2026-03-20 16:13:54.694044518 +0000 UTC m=+789.368381253" Mar 20 16:13:56 crc kubenswrapper[4708]: I0320 16:13:56.178534 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:13:56 crc kubenswrapper[4708]: I0320 16:13:56.179779 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:13:58 crc kubenswrapper[4708]: I0320 16:13:58.805123 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rcmhv"] Mar 20 16:13:58 crc kubenswrapper[4708]: I0320 16:13:58.805480 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovn-controller" containerID="cri-o://395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d" gracePeriod=30 Mar 20 16:13:58 crc kubenswrapper[4708]: I0320 16:13:58.805492 4708 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="nbdb" containerID="cri-o://52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84" gracePeriod=30 Mar 20 16:13:58 crc kubenswrapper[4708]: I0320 16:13:58.805589 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="northd" containerID="cri-o://1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79" gracePeriod=30 Mar 20 16:13:58 crc kubenswrapper[4708]: I0320 16:13:58.805619 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8" gracePeriod=30 Mar 20 16:13:58 crc kubenswrapper[4708]: I0320 16:13:58.805649 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="kube-rbac-proxy-node" containerID="cri-o://533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4" gracePeriod=30 Mar 20 16:13:58 crc kubenswrapper[4708]: I0320 16:13:58.805705 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovn-acl-logging" containerID="cri-o://a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2" gracePeriod=30 Mar 20 16:13:58 crc kubenswrapper[4708]: I0320 16:13:58.805919 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="sbdb" 
containerID="cri-o://d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2" gracePeriod=30 Mar 20 16:13:58 crc kubenswrapper[4708]: I0320 16:13:58.851271 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" containerID="cri-o://81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c" gracePeriod=30 Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.104464 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/3.log" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.106741 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovn-acl-logging/0.log" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.107243 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovn-controller/0.log" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.107731 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142199 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-script-lib\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142260 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-kubelet\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142285 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-netns\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142315 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-node-log\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142341 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-var-lib-openvswitch\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142362 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-bin\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142390 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-env-overrides\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142410 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-ovn-kubernetes\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142443 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-slash\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142482 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-systemd\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142503 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-openvswitch\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc 
kubenswrapper[4708]: I0320 16:13:59.142526 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-config\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142542 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-log-socket\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142570 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-netd\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142488 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-node-log" (OuterVolumeSpecName: "node-log") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142528 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142557 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-slash" (OuterVolumeSpecName: "host-slash") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142576 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142554 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142597 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142653 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143071 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143125 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143142 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143168 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.142596 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-systemd-units\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143171 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-log-socket" (OuterVolumeSpecName: "log-socket") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143189 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143297 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-ovn\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143330 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovn-node-metrics-cert\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143426 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143482 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-etc-openvswitch\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143125 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143510 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fwwd\" (UniqueName: \"kubernetes.io/projected/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-kube-api-access-8fwwd\") pod \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\" (UID: \"079cc7a0-ceb7-4921-b022-bbe67ae0fad5\") " Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143326 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143554 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143583 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143870 4708 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143887 4708 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143897 4708 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143907 4708 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143931 4708 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143962 4708 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.143971 4708 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc 
kubenswrapper[4708]: I0320 16:13:59.143981 4708 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.144044 4708 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.144057 4708 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.144082 4708 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.144091 4708 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.144099 4708 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.144107 4708 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.144115 4708 reconciler_common.go:293] "Volume detached for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.144124 4708 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.144132 4708 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.154122 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.154313 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-kube-api-access-8fwwd" (OuterVolumeSpecName: "kube-api-access-8fwwd") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "kube-api-access-8fwwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161138 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9c9pr"] Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161402 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="kubecfg-setup" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161415 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="kubecfg-setup" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161427 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="kube-rbac-proxy-node" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161433 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="kube-rbac-proxy-node" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161442 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161449 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161457 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161462 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161471 4708 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161478 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161487 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovn-acl-logging" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161494 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovn-acl-logging" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161504 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="nbdb" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161510 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="nbdb" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161517 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="sbdb" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161523 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="sbdb" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161533 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161539 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161546 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" 
containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161552 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161560 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="northd" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161566 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="northd" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161574 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovn-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161579 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovn-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161720 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161731 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161738 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161747 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovn-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161752 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" 
containerName="kube-rbac-proxy-ovn-metrics" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161760 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="nbdb" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161766 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="sbdb" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161772 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="kube-rbac-proxy-node" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161781 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovn-acl-logging" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161790 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161797 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="northd" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.161878 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161885 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.161989 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerName="ovnkube-controller" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.164052 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.164536 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "079cc7a0-ceb7-4921-b022-bbe67ae0fad5" (UID: "079cc7a0-ceb7-4921-b022-bbe67ae0fad5"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.244853 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-log-socket\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.244895 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-run-ovn-kubernetes\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.244928 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-cni-bin\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.244995 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-cni-netd\") 
pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245059 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-node-log\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245136 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsggb\" (UniqueName: \"kubernetes.io/projected/6913edfc-f3a6-4249-bdaf-f13baa815fcf-kube-api-access-rsggb\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245169 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-run-ovn\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245191 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245269 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-kubelet\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245444 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-run-netns\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245477 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6913edfc-f3a6-4249-bdaf-f13baa815fcf-ovnkube-config\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245500 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6913edfc-f3a6-4249-bdaf-f13baa815fcf-ovn-node-metrics-cert\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245525 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-var-lib-openvswitch\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245572 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6913edfc-f3a6-4249-bdaf-f13baa815fcf-ovnkube-script-lib\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245594 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-etc-openvswitch\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245617 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-slash\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245659 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-systemd-units\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245691 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-run-openvswitch\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245709 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-run-systemd\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245726 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6913edfc-f3a6-4249-bdaf-f13baa815fcf-env-overrides\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245796 4708 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245816 4708 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.245829 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fwwd\" (UniqueName: \"kubernetes.io/projected/079cc7a0-ceb7-4921-b022-bbe67ae0fad5-kube-api-access-8fwwd\") on node \"crc\" DevicePath \"\"" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.277469 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovnkube-controller/3.log" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.279912 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovn-acl-logging/0.log" Mar 20 
16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.280613 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rcmhv_079cc7a0-ceb7-4921-b022-bbe67ae0fad5/ovn-controller/0.log" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281042 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c" exitCode=0 Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281074 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2" exitCode=0 Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281084 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84" exitCode=0 Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281095 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79" exitCode=0 Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281105 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8" exitCode=0 Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281117 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4" exitCode=0 Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281125 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" 
containerID="a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2" exitCode=143 Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281134 4708 generic.go:334] "Generic (PLEG): container finished" podID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" containerID="395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d" exitCode=143 Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281170 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281171 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281280 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281299 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281314 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281327 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" 
event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281341 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281354 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281366 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281375 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281380 4708 scope.go:117] "RemoveContainer" containerID="81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281384 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281457 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281467 4708 pod_container_deletor.go:114] "Failed to 
issue the request to remove container" containerID={"Type":"cri-o","ID":"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281474 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281481 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281488 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281501 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281514 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281521 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281528 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 
16:13:59.281533 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281539 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281546 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281553 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281559 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281565 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281573 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281583 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" 
event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281595 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281603 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281616 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281623 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281630 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281637 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281645 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281652 4708 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281658 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281681 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281693 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rcmhv" event={"ID":"079cc7a0-ceb7-4921-b022-bbe67ae0fad5","Type":"ContainerDied","Data":"dda90ce3aeec8c31d026f5fe2271d3ced123bdc26d8194deacf44f8f8cf9513d"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281718 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281726 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281733 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281741 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84"} Mar 20 16:13:59 crc kubenswrapper[4708]: 
I0320 16:13:59.281748 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281755 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281761 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281769 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281776 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.281783 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.283818 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8kspl_f49a68df-98d0-464f-b40e-0aba2faab528/kube-multus/2.log" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.284620 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8kspl_f49a68df-98d0-464f-b40e-0aba2faab528/kube-multus/1.log" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.284664 4708 generic.go:334] "Generic (PLEG): container finished" 
podID="f49a68df-98d0-464f-b40e-0aba2faab528" containerID="6194e56dac24e25230c92b6148f9a7bc07ff22fcbb4939993823336fbeddcc7a" exitCode=2 Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.284710 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8kspl" event={"ID":"f49a68df-98d0-464f-b40e-0aba2faab528","Type":"ContainerDied","Data":"6194e56dac24e25230c92b6148f9a7bc07ff22fcbb4939993823336fbeddcc7a"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.284732 4708 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b"} Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.285171 4708 scope.go:117] "RemoveContainer" containerID="6194e56dac24e25230c92b6148f9a7bc07ff22fcbb4939993823336fbeddcc7a" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.285333 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8kspl_openshift-multus(f49a68df-98d0-464f-b40e-0aba2faab528)\"" pod="openshift-multus/multus-8kspl" podUID="f49a68df-98d0-464f-b40e-0aba2faab528" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.298645 4708 scope.go:117] "RemoveContainer" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.312588 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-tgmps" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.325141 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rcmhv"] Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.326884 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rcmhv"] Mar 20 
16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.337398 4708 scope.go:117] "RemoveContainer" containerID="d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.347967 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-cni-netd\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348021 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-node-log\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348049 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsggb\" (UniqueName: \"kubernetes.io/projected/6913edfc-f3a6-4249-bdaf-f13baa815fcf-kube-api-access-rsggb\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348075 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-run-ovn\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348097 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348122 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-kubelet\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348148 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-run-netns\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348172 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6913edfc-f3a6-4249-bdaf-f13baa815fcf-ovnkube-config\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348194 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6913edfc-f3a6-4249-bdaf-f13baa815fcf-ovn-node-metrics-cert\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348228 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-var-lib-openvswitch\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348256 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6913edfc-f3a6-4249-bdaf-f13baa815fcf-ovnkube-script-lib\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348284 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-etc-openvswitch\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348323 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-slash\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348359 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-systemd-units\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348380 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-run-openvswitch\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348405 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-run-systemd\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348425 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6913edfc-f3a6-4249-bdaf-f13baa815fcf-env-overrides\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348456 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-run-ovn-kubernetes\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348476 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-log-socket\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348507 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-cni-bin\") 
pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348581 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-cni-bin\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348625 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-cni-netd\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348653 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-node-log\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.348971 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-run-ovn\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.349006 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.349036 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-kubelet\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.349062 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-run-netns\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.350294 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6913edfc-f3a6-4249-bdaf-f13baa815fcf-ovnkube-config\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.350356 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-slash\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.351005 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-run-systemd\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.351025 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-systemd-units\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.351044 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-host-run-ovn-kubernetes\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.351058 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-run-openvswitch\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.351083 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-var-lib-openvswitch\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.351114 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-log-socket\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.351455 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6913edfc-f3a6-4249-bdaf-f13baa815fcf-env-overrides\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.351489 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6913edfc-f3a6-4249-bdaf-f13baa815fcf-etc-openvswitch\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.351910 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6913edfc-f3a6-4249-bdaf-f13baa815fcf-ovnkube-script-lib\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.352330 4708 scope.go:117] "RemoveContainer" containerID="52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.354835 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6913edfc-f3a6-4249-bdaf-f13baa815fcf-ovn-node-metrics-cert\") pod \"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.366313 4708 scope.go:117] "RemoveContainer" containerID="1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.369851 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsggb\" (UniqueName: \"kubernetes.io/projected/6913edfc-f3a6-4249-bdaf-f13baa815fcf-kube-api-access-rsggb\") pod 
\"ovnkube-node-9c9pr\" (UID: \"6913edfc-f3a6-4249-bdaf-f13baa815fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.380730 4708 scope.go:117] "RemoveContainer" containerID="38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.397625 4708 scope.go:117] "RemoveContainer" containerID="533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.411024 4708 scope.go:117] "RemoveContainer" containerID="a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.424258 4708 scope.go:117] "RemoveContainer" containerID="395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.437487 4708 scope.go:117] "RemoveContainer" containerID="96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.448319 4708 scope.go:117] "RemoveContainer" containerID="81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.448725 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c\": container with ID starting with 81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c not found: ID does not exist" containerID="81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.448767 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c"} err="failed to get container status \"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c\": 
rpc error: code = NotFound desc = could not find container \"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c\": container with ID starting with 81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.448793 4708 scope.go:117] "RemoveContainer" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.449088 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\": container with ID starting with ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c not found: ID does not exist" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.449137 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c"} err="failed to get container status \"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\": rpc error: code = NotFound desc = could not find container \"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\": container with ID starting with ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.449170 4708 scope.go:117] "RemoveContainer" containerID="d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.449438 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\": container with ID starting with 
d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2 not found: ID does not exist" containerID="d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.449463 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2"} err="failed to get container status \"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\": rpc error: code = NotFound desc = could not find container \"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\": container with ID starting with d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.449477 4708 scope.go:117] "RemoveContainer" containerID="52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.449723 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\": container with ID starting with 52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84 not found: ID does not exist" containerID="52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.449749 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84"} err="failed to get container status \"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\": rpc error: code = NotFound desc = could not find container \"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\": container with ID starting with 52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84 not found: ID does not 
exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.449765 4708 scope.go:117] "RemoveContainer" containerID="1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.450021 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\": container with ID starting with 1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79 not found: ID does not exist" containerID="1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.450048 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79"} err="failed to get container status \"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\": rpc error: code = NotFound desc = could not find container \"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\": container with ID starting with 1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.450062 4708 scope.go:117] "RemoveContainer" containerID="38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.450279 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\": container with ID starting with 38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8 not found: ID does not exist" containerID="38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.450304 4708 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8"} err="failed to get container status \"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\": rpc error: code = NotFound desc = could not find container \"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\": container with ID starting with 38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.450317 4708 scope.go:117] "RemoveContainer" containerID="533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.450545 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\": container with ID starting with 533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4 not found: ID does not exist" containerID="533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.450567 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4"} err="failed to get container status \"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\": rpc error: code = NotFound desc = could not find container \"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\": container with ID starting with 533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.450579 4708 scope.go:117] "RemoveContainer" containerID="a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.450797 4708 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\": container with ID starting with a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2 not found: ID does not exist" containerID="a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.450852 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2"} err="failed to get container status \"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\": rpc error: code = NotFound desc = could not find container \"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\": container with ID starting with a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.450867 4708 scope.go:117] "RemoveContainer" containerID="395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.451100 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\": container with ID starting with 395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d not found: ID does not exist" containerID="395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.451131 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d"} err="failed to get container status \"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\": rpc error: code = NotFound desc = could 
not find container \"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\": container with ID starting with 395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.451150 4708 scope.go:117] "RemoveContainer" containerID="96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2" Mar 20 16:13:59 crc kubenswrapper[4708]: E0320 16:13:59.451359 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\": container with ID starting with 96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2 not found: ID does not exist" containerID="96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.451381 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2"} err="failed to get container status \"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\": rpc error: code = NotFound desc = could not find container \"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\": container with ID starting with 96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.451395 4708 scope.go:117] "RemoveContainer" containerID="81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.451587 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c"} err="failed to get container status \"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c\": rpc error: code = NotFound 
desc = could not find container \"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c\": container with ID starting with 81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.451611 4708 scope.go:117] "RemoveContainer" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.451818 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c"} err="failed to get container status \"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\": rpc error: code = NotFound desc = could not find container \"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\": container with ID starting with ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.451835 4708 scope.go:117] "RemoveContainer" containerID="d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.452009 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2"} err="failed to get container status \"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\": rpc error: code = NotFound desc = could not find container \"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\": container with ID starting with d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.452029 4708 scope.go:117] "RemoveContainer" containerID="52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 
16:13:59.452214 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84"} err="failed to get container status \"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\": rpc error: code = NotFound desc = could not find container \"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\": container with ID starting with 52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.452239 4708 scope.go:117] "RemoveContainer" containerID="1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.452468 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79"} err="failed to get container status \"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\": rpc error: code = NotFound desc = could not find container \"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\": container with ID starting with 1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.452486 4708 scope.go:117] "RemoveContainer" containerID="38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.452708 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8"} err="failed to get container status \"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\": rpc error: code = NotFound desc = could not find container \"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\": container with ID starting with 
38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.452730 4708 scope.go:117] "RemoveContainer" containerID="533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.452936 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4"} err="failed to get container status \"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\": rpc error: code = NotFound desc = could not find container \"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\": container with ID starting with 533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.452963 4708 scope.go:117] "RemoveContainer" containerID="a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.453172 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2"} err="failed to get container status \"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\": rpc error: code = NotFound desc = could not find container \"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\": container with ID starting with a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.453190 4708 scope.go:117] "RemoveContainer" containerID="395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.453375 4708 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d"} err="failed to get container status \"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\": rpc error: code = NotFound desc = could not find container \"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\": container with ID starting with 395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.453393 4708 scope.go:117] "RemoveContainer" containerID="96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.453600 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2"} err="failed to get container status \"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\": rpc error: code = NotFound desc = could not find container \"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\": container with ID starting with 96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.453630 4708 scope.go:117] "RemoveContainer" containerID="81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.453977 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c"} err="failed to get container status \"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c\": rpc error: code = NotFound desc = could not find container \"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c\": container with ID starting with 81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c not found: ID does not 
exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.454020 4708 scope.go:117] "RemoveContainer" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.454274 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c"} err="failed to get container status \"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\": rpc error: code = NotFound desc = could not find container \"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\": container with ID starting with ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.454305 4708 scope.go:117] "RemoveContainer" containerID="d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.454517 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2"} err="failed to get container status \"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\": rpc error: code = NotFound desc = could not find container \"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\": container with ID starting with d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.454550 4708 scope.go:117] "RemoveContainer" containerID="52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.454792 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84"} err="failed to get container status 
\"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\": rpc error: code = NotFound desc = could not find container \"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\": container with ID starting with 52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.454819 4708 scope.go:117] "RemoveContainer" containerID="1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.455117 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79"} err="failed to get container status \"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\": rpc error: code = NotFound desc = could not find container \"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\": container with ID starting with 1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.455144 4708 scope.go:117] "RemoveContainer" containerID="38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.455351 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8"} err="failed to get container status \"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\": rpc error: code = NotFound desc = could not find container \"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\": container with ID starting with 38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.455370 4708 scope.go:117] "RemoveContainer" 
containerID="533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.455568 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4"} err="failed to get container status \"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\": rpc error: code = NotFound desc = could not find container \"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\": container with ID starting with 533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.455589 4708 scope.go:117] "RemoveContainer" containerID="a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.455856 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2"} err="failed to get container status \"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\": rpc error: code = NotFound desc = could not find container \"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\": container with ID starting with a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.455886 4708 scope.go:117] "RemoveContainer" containerID="395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.456082 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d"} err="failed to get container status \"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\": rpc error: code = NotFound desc = could 
not find container \"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\": container with ID starting with 395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.456102 4708 scope.go:117] "RemoveContainer" containerID="96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.456290 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2"} err="failed to get container status \"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\": rpc error: code = NotFound desc = could not find container \"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\": container with ID starting with 96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.456310 4708 scope.go:117] "RemoveContainer" containerID="81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.456486 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c"} err="failed to get container status \"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c\": rpc error: code = NotFound desc = could not find container \"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c\": container with ID starting with 81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.456515 4708 scope.go:117] "RemoveContainer" containerID="ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 
16:13:59.456798 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c"} err="failed to get container status \"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\": rpc error: code = NotFound desc = could not find container \"ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c\": container with ID starting with ff5012da38d1ddea94b2f78e2cd73aa18ec35209f12a2d83b60c5d52d7e1a96c not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.456820 4708 scope.go:117] "RemoveContainer" containerID="d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.457025 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2"} err="failed to get container status \"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\": rpc error: code = NotFound desc = could not find container \"d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2\": container with ID starting with d34a1f254bb25fa7202cf6c7875bf65d3687ab0146b0cfafa05d7b0c80d17de2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.457040 4708 scope.go:117] "RemoveContainer" containerID="52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.457289 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84"} err="failed to get container status \"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\": rpc error: code = NotFound desc = could not find container \"52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84\": container with ID starting with 
52826f86458de7632de4685c9f418c99caf9b99e2c35ef8bf669d04934975d84 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.457311 4708 scope.go:117] "RemoveContainer" containerID="1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.457579 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79"} err="failed to get container status \"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\": rpc error: code = NotFound desc = could not find container \"1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79\": container with ID starting with 1e5997db7a5cb913dafb27215dcb652d703fb1dddda3d46e5f2e481d59705f79 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.457603 4708 scope.go:117] "RemoveContainer" containerID="38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.457850 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8"} err="failed to get container status \"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\": rpc error: code = NotFound desc = could not find container \"38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8\": container with ID starting with 38520e45bc3a56593c4c578dcaa6a748db2fdff115c1b192cdeab9a07556e8c8 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.457876 4708 scope.go:117] "RemoveContainer" containerID="533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.458090 4708 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4"} err="failed to get container status \"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\": rpc error: code = NotFound desc = could not find container \"533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4\": container with ID starting with 533cc36519ea0c31e89a3d9a7fa1cab89020a728cf04c09306dd8d00e65ceaf4 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.458111 4708 scope.go:117] "RemoveContainer" containerID="a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.458369 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2"} err="failed to get container status \"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\": rpc error: code = NotFound desc = could not find container \"a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2\": container with ID starting with a906f666c0816628bdfc73c08b1246ad4803cc7d0ab44aa075d6e0c6d9382bd2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.458391 4708 scope.go:117] "RemoveContainer" containerID="395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.458612 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d"} err="failed to get container status \"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\": rpc error: code = NotFound desc = could not find container \"395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d\": container with ID starting with 395d213c19282404d359fb7c8fd5b3c196c905ef68f5f8c4f9b7b37cded14a5d not found: ID does not 
exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.458651 4708 scope.go:117] "RemoveContainer" containerID="96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.458874 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2"} err="failed to get container status \"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\": rpc error: code = NotFound desc = could not find container \"96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2\": container with ID starting with 96776f97ae364b5c06226df5244992de25bb2f3652b9b9bcf266fcf90f203cd2 not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.458894 4708 scope.go:117] "RemoveContainer" containerID="81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.459095 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c"} err="failed to get container status \"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c\": rpc error: code = NotFound desc = could not find container \"81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c\": container with ID starting with 81ff237fab051eeadb2431607fa59db7b85591d3c4af3198efd6fbc34140154c not found: ID does not exist" Mar 20 16:13:59 crc kubenswrapper[4708]: I0320 16:13:59.491851 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.118122 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079cc7a0-ceb7-4921-b022-bbe67ae0fad5" path="/var/lib/kubelet/pods/079cc7a0-ceb7-4921-b022-bbe67ae0fad5/volumes" Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.133063 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567054-5hj5k"] Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.134182 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.136796 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5" Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.137117 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.137341 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.155523 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65wc5\" (UniqueName: \"kubernetes.io/projected/160992b9-f53d-47e7-9b7a-1436f5f815e2-kube-api-access-65wc5\") pod \"auto-csr-approver-29567054-5hj5k\" (UID: \"160992b9-f53d-47e7-9b7a-1436f5f815e2\") " pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.256629 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65wc5\" (UniqueName: \"kubernetes.io/projected/160992b9-f53d-47e7-9b7a-1436f5f815e2-kube-api-access-65wc5\") pod \"auto-csr-approver-29567054-5hj5k\" (UID: 
\"160992b9-f53d-47e7-9b7a-1436f5f815e2\") " pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.275578 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65wc5\" (UniqueName: \"kubernetes.io/projected/160992b9-f53d-47e7-9b7a-1436f5f815e2-kube-api-access-65wc5\") pod \"auto-csr-approver-29567054-5hj5k\" (UID: \"160992b9-f53d-47e7-9b7a-1436f5f815e2\") " pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.290824 4708 generic.go:334] "Generic (PLEG): container finished" podID="6913edfc-f3a6-4249-bdaf-f13baa815fcf" containerID="61bce812b995af79186b34ee7130b530b9170146a043d45228dd7a10f72b8044" exitCode=0 Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.290890 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" event={"ID":"6913edfc-f3a6-4249-bdaf-f13baa815fcf","Type":"ContainerDied","Data":"61bce812b995af79186b34ee7130b530b9170146a043d45228dd7a10f72b8044"} Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.290937 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" event={"ID":"6913edfc-f3a6-4249-bdaf-f13baa815fcf","Type":"ContainerStarted","Data":"d32a07d088588f66e12078a80406bc95316f7aa765c43db6c31295e572d7ab00"} Mar 20 16:14:00 crc kubenswrapper[4708]: I0320 16:14:00.467155 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:00 crc kubenswrapper[4708]: E0320 16:14:00.491602 4708 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(b89e6a5019e67d16e0b5fbf5b870f7828bc421c31c856724ad83c2c8c19dd14c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:14:00 crc kubenswrapper[4708]: E0320 16:14:00.491706 4708 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(b89e6a5019e67d16e0b5fbf5b870f7828bc421c31c856724ad83c2c8c19dd14c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:00 crc kubenswrapper[4708]: E0320 16:14:00.491729 4708 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(b89e6a5019e67d16e0b5fbf5b870f7828bc421c31c856724ad83c2c8c19dd14c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:00 crc kubenswrapper[4708]: E0320 16:14:00.491782 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29567054-5hj5k_openshift-infra(160992b9-f53d-47e7-9b7a-1436f5f815e2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29567054-5hj5k_openshift-infra(160992b9-f53d-47e7-9b7a-1436f5f815e2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(b89e6a5019e67d16e0b5fbf5b870f7828bc421c31c856724ad83c2c8c19dd14c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" podUID="160992b9-f53d-47e7-9b7a-1436f5f815e2" Mar 20 16:14:01 crc kubenswrapper[4708]: I0320 16:14:01.300072 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" event={"ID":"6913edfc-f3a6-4249-bdaf-f13baa815fcf","Type":"ContainerStarted","Data":"c6fcd2591a59933c52045614983bec457c6cd58d393e990c3f5f0c523439816c"} Mar 20 16:14:01 crc kubenswrapper[4708]: I0320 16:14:01.300393 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" event={"ID":"6913edfc-f3a6-4249-bdaf-f13baa815fcf","Type":"ContainerStarted","Data":"7029bd30e8179ab57bd17d576432cb9315061ae0615d9f4d3362a254becb86a2"} Mar 20 16:14:01 crc kubenswrapper[4708]: I0320 16:14:01.300405 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" event={"ID":"6913edfc-f3a6-4249-bdaf-f13baa815fcf","Type":"ContainerStarted","Data":"a03f021efb62a39beae748e9d6ff59eca41fef1ecdd0f33c3791c9f528d8bac8"} Mar 20 16:14:01 crc kubenswrapper[4708]: I0320 16:14:01.300414 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" event={"ID":"6913edfc-f3a6-4249-bdaf-f13baa815fcf","Type":"ContainerStarted","Data":"520510243f778104f7e984c4136d675de1c7518461532131523a008c3f8c2ba3"} Mar 20 16:14:01 crc kubenswrapper[4708]: I0320 16:14:01.300422 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" event={"ID":"6913edfc-f3a6-4249-bdaf-f13baa815fcf","Type":"ContainerStarted","Data":"e5d3dcd722cd313508a13f975be367794565530674eb3032cd5dca21e6f0ad95"} Mar 20 16:14:01 crc kubenswrapper[4708]: I0320 16:14:01.300433 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" event={"ID":"6913edfc-f3a6-4249-bdaf-f13baa815fcf","Type":"ContainerStarted","Data":"cdd9287410561b6de2061ef35258c3cb979345595e7398074759c49de36a0fca"} Mar 20 16:14:03 crc kubenswrapper[4708]: I0320 16:14:03.313611 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" event={"ID":"6913edfc-f3a6-4249-bdaf-f13baa815fcf","Type":"ContainerStarted","Data":"e5bdfc7ca00ded0df7f5ddd09b17c0c71de0f9c88080497296d2fc85bf59db6b"} Mar 20 16:14:06 crc kubenswrapper[4708]: I0320 16:14:06.176291 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-5hj5k"] Mar 20 16:14:06 crc kubenswrapper[4708]: I0320 16:14:06.176961 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:06 crc kubenswrapper[4708]: I0320 16:14:06.177327 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:06 crc kubenswrapper[4708]: E0320 16:14:06.207227 4708 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(de8aa9a1ad02a3d42d6f9a637aa5c9dde07379664a81aa8032a45a10d0e3d591): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:14:06 crc kubenswrapper[4708]: E0320 16:14:06.207299 4708 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(de8aa9a1ad02a3d42d6f9a637aa5c9dde07379664a81aa8032a45a10d0e3d591): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:06 crc kubenswrapper[4708]: E0320 16:14:06.207319 4708 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(de8aa9a1ad02a3d42d6f9a637aa5c9dde07379664a81aa8032a45a10d0e3d591): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:06 crc kubenswrapper[4708]: E0320 16:14:06.207390 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29567054-5hj5k_openshift-infra(160992b9-f53d-47e7-9b7a-1436f5f815e2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29567054-5hj5k_openshift-infra(160992b9-f53d-47e7-9b7a-1436f5f815e2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(de8aa9a1ad02a3d42d6f9a637aa5c9dde07379664a81aa8032a45a10d0e3d591): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" podUID="160992b9-f53d-47e7-9b7a-1436f5f815e2" Mar 20 16:14:06 crc kubenswrapper[4708]: I0320 16:14:06.335730 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" event={"ID":"6913edfc-f3a6-4249-bdaf-f13baa815fcf","Type":"ContainerStarted","Data":"c9fb26a1c8542856473f67e50f76f8047c12b14827fce81e562a06ae4e14267c"} Mar 20 16:14:06 crc kubenswrapper[4708]: I0320 16:14:06.336038 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:14:06 crc kubenswrapper[4708]: I0320 16:14:06.336086 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:14:06 crc kubenswrapper[4708]: I0320 16:14:06.369103 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" podStartSLOduration=7.369085703 podStartE2EDuration="7.369085703s" podCreationTimestamp="2026-03-20 16:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 16:14:06.366063981 +0000 UTC m=+801.040400716" watchObservedRunningTime="2026-03-20 16:14:06.369085703 +0000 UTC m=+801.043422428" Mar 20 16:14:06 crc kubenswrapper[4708]: I0320 16:14:06.384235 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:14:07 crc kubenswrapper[4708]: I0320 16:14:07.340877 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:14:07 crc kubenswrapper[4708]: I0320 16:14:07.365231 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:14:11 crc kubenswrapper[4708]: I0320 16:14:11.102963 4708 scope.go:117] "RemoveContainer" containerID="2d7374793212b286a5c1962f24ab6506d6d15c0e27f9c9ad16ac05e424fb0b5b" Mar 20 16:14:11 crc kubenswrapper[4708]: I0320 16:14:11.112986 4708 scope.go:117] "RemoveContainer" containerID="6194e56dac24e25230c92b6148f9a7bc07ff22fcbb4939993823336fbeddcc7a" Mar 20 16:14:11 crc kubenswrapper[4708]: E0320 16:14:11.115104 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8kspl_openshift-multus(f49a68df-98d0-464f-b40e-0aba2faab528)\"" pod="openshift-multus/multus-8kspl" podUID="f49a68df-98d0-464f-b40e-0aba2faab528" Mar 20 16:14:11 crc kubenswrapper[4708]: I0320 16:14:11.363759 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8kspl_f49a68df-98d0-464f-b40e-0aba2faab528/kube-multus/2.log" Mar 20 16:14:18 crc kubenswrapper[4708]: I0320 16:14:18.110904 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:18 crc kubenswrapper[4708]: I0320 16:14:18.112201 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:18 crc kubenswrapper[4708]: E0320 16:14:18.145130 4708 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(31f189d5622937517a2d55c386255da56112ef9f4624a2b6bf11fedfb2b9cd94): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 16:14:18 crc kubenswrapper[4708]: E0320 16:14:18.145192 4708 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(31f189d5622937517a2d55c386255da56112ef9f4624a2b6bf11fedfb2b9cd94): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:18 crc kubenswrapper[4708]: E0320 16:14:18.145212 4708 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(31f189d5622937517a2d55c386255da56112ef9f4624a2b6bf11fedfb2b9cd94): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:18 crc kubenswrapper[4708]: E0320 16:14:18.145257 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29567054-5hj5k_openshift-infra(160992b9-f53d-47e7-9b7a-1436f5f815e2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29567054-5hj5k_openshift-infra(160992b9-f53d-47e7-9b7a-1436f5f815e2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567054-5hj5k_openshift-infra_160992b9-f53d-47e7-9b7a-1436f5f815e2_0(31f189d5622937517a2d55c386255da56112ef9f4624a2b6bf11fedfb2b9cd94): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" podUID="160992b9-f53d-47e7-9b7a-1436f5f815e2" Mar 20 16:14:25 crc kubenswrapper[4708]: I0320 16:14:25.110593 4708 scope.go:117] "RemoveContainer" containerID="6194e56dac24e25230c92b6148f9a7bc07ff22fcbb4939993823336fbeddcc7a" Mar 20 16:14:26 crc kubenswrapper[4708]: I0320 16:14:26.179367 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:14:26 crc kubenswrapper[4708]: I0320 16:14:26.179980 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:14:26 crc kubenswrapper[4708]: I0320 16:14:26.461139 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-8kspl_f49a68df-98d0-464f-b40e-0aba2faab528/kube-multus/2.log" Mar 20 16:14:26 crc kubenswrapper[4708]: I0320 16:14:26.461423 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8kspl" event={"ID":"f49a68df-98d0-464f-b40e-0aba2faab528","Type":"ContainerStarted","Data":"94c1b1414c095326e42120bb769dd7c28e8bd6a221123dcebb3a4015f11f22ac"} Mar 20 16:14:29 crc kubenswrapper[4708]: I0320 16:14:29.518968 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9c9pr" Mar 20 16:14:30 crc kubenswrapper[4708]: I0320 16:14:30.110073 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:30 crc kubenswrapper[4708]: I0320 16:14:30.111086 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:30 crc kubenswrapper[4708]: I0320 16:14:30.523957 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-5hj5k"] Mar 20 16:14:31 crc kubenswrapper[4708]: I0320 16:14:31.492929 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" event={"ID":"160992b9-f53d-47e7-9b7a-1436f5f815e2","Type":"ContainerStarted","Data":"e05150de3b115bdbe1ae044c57c2656604729456d7210319c75a58448941da62"} Mar 20 16:14:34 crc kubenswrapper[4708]: I0320 16:14:34.746786 4708 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 16:14:35 crc kubenswrapper[4708]: I0320 16:14:35.526030 4708 generic.go:334] "Generic (PLEG): container finished" podID="160992b9-f53d-47e7-9b7a-1436f5f815e2" containerID="33f4fa4a7b7372a5c55fead11f4ec4eded6db7d99e0aec9664fbcfe988e06cc3" exitCode=0 Mar 20 16:14:35 crc kubenswrapper[4708]: I0320 16:14:35.526369 
4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" event={"ID":"160992b9-f53d-47e7-9b7a-1436f5f815e2","Type":"ContainerDied","Data":"33f4fa4a7b7372a5c55fead11f4ec4eded6db7d99e0aec9664fbcfe988e06cc3"} Mar 20 16:14:36 crc kubenswrapper[4708]: I0320 16:14:36.791816 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:36 crc kubenswrapper[4708]: I0320 16:14:36.885579 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65wc5\" (UniqueName: \"kubernetes.io/projected/160992b9-f53d-47e7-9b7a-1436f5f815e2-kube-api-access-65wc5\") pod \"160992b9-f53d-47e7-9b7a-1436f5f815e2\" (UID: \"160992b9-f53d-47e7-9b7a-1436f5f815e2\") " Mar 20 16:14:36 crc kubenswrapper[4708]: I0320 16:14:36.891691 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/160992b9-f53d-47e7-9b7a-1436f5f815e2-kube-api-access-65wc5" (OuterVolumeSpecName: "kube-api-access-65wc5") pod "160992b9-f53d-47e7-9b7a-1436f5f815e2" (UID: "160992b9-f53d-47e7-9b7a-1436f5f815e2"). InnerVolumeSpecName "kube-api-access-65wc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:14:36 crc kubenswrapper[4708]: I0320 16:14:36.986995 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65wc5\" (UniqueName: \"kubernetes.io/projected/160992b9-f53d-47e7-9b7a-1436f5f815e2-kube-api-access-65wc5\") on node \"crc\" DevicePath \"\"" Mar 20 16:14:37 crc kubenswrapper[4708]: I0320 16:14:37.550059 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" event={"ID":"160992b9-f53d-47e7-9b7a-1436f5f815e2","Type":"ContainerDied","Data":"e05150de3b115bdbe1ae044c57c2656604729456d7210319c75a58448941da62"} Mar 20 16:14:37 crc kubenswrapper[4708]: I0320 16:14:37.550110 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e05150de3b115bdbe1ae044c57c2656604729456d7210319c75a58448941da62" Mar 20 16:14:37 crc kubenswrapper[4708]: I0320 16:14:37.550122 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-5hj5k" Mar 20 16:14:37 crc kubenswrapper[4708]: I0320 16:14:37.856776 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-gwjs8"] Mar 20 16:14:37 crc kubenswrapper[4708]: I0320 16:14:37.867374 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-gwjs8"] Mar 20 16:14:38 crc kubenswrapper[4708]: I0320 16:14:38.116878 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba792d3-1a9c-4aa8-929f-fa66ba85a87a" path="/var/lib/kubelet/pods/6ba792d3-1a9c-4aa8-929f-fa66ba85a87a/volumes" Mar 20 16:14:39 crc kubenswrapper[4708]: I0320 16:14:39.970091 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m"] Mar 20 16:14:39 crc kubenswrapper[4708]: E0320 16:14:39.970786 4708 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="160992b9-f53d-47e7-9b7a-1436f5f815e2" containerName="oc" Mar 20 16:14:39 crc kubenswrapper[4708]: I0320 16:14:39.970802 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="160992b9-f53d-47e7-9b7a-1436f5f815e2" containerName="oc" Mar 20 16:14:39 crc kubenswrapper[4708]: I0320 16:14:39.970922 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="160992b9-f53d-47e7-9b7a-1436f5f815e2" containerName="oc" Mar 20 16:14:39 crc kubenswrapper[4708]: I0320 16:14:39.971759 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:39 crc kubenswrapper[4708]: I0320 16:14:39.974333 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 16:14:39 crc kubenswrapper[4708]: I0320 16:14:39.980751 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m"] Mar 20 16:14:40 crc kubenswrapper[4708]: I0320 16:14:40.123238 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvd8w\" (UniqueName: \"kubernetes.io/projected/4b39784f-91ed-47c3-a778-3bd4f77ca786-kube-api-access-rvd8w\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:40 crc kubenswrapper[4708]: I0320 16:14:40.123385 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:40 crc kubenswrapper[4708]: I0320 16:14:40.123478 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:40 crc kubenswrapper[4708]: I0320 16:14:40.224609 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:40 crc kubenswrapper[4708]: I0320 16:14:40.224728 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvd8w\" (UniqueName: \"kubernetes.io/projected/4b39784f-91ed-47c3-a778-3bd4f77ca786-kube-api-access-rvd8w\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:40 crc kubenswrapper[4708]: I0320 16:14:40.224759 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:40 crc kubenswrapper[4708]: 
I0320 16:14:40.225122 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:40 crc kubenswrapper[4708]: I0320 16:14:40.225163 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:40 crc kubenswrapper[4708]: I0320 16:14:40.245882 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvd8w\" (UniqueName: \"kubernetes.io/projected/4b39784f-91ed-47c3-a778-3bd4f77ca786-kube-api-access-rvd8w\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:40 crc kubenswrapper[4708]: I0320 16:14:40.289006 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:40 crc kubenswrapper[4708]: I0320 16:14:40.728793 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m"] Mar 20 16:14:41 crc kubenswrapper[4708]: I0320 16:14:41.579115 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" event={"ID":"4b39784f-91ed-47c3-a778-3bd4f77ca786","Type":"ContainerStarted","Data":"9eb3acd39c1afb52ee41c1bf384847bae413358baf5c58f101cf59a766cc03bf"} Mar 20 16:14:41 crc kubenswrapper[4708]: I0320 16:14:41.579380 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" event={"ID":"4b39784f-91ed-47c3-a778-3bd4f77ca786","Type":"ContainerStarted","Data":"1497a9d3f41cc7fd05566f0d5aa9972c81c5b792ac60c114f5f717e37e0fe291"} Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.181657 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kbp2n"] Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.182993 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.202959 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbp2n"] Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.353860 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-catalog-content\") pod \"redhat-operators-kbp2n\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.353922 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-utilities\") pod \"redhat-operators-kbp2n\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.353958 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25jj8\" (UniqueName: \"kubernetes.io/projected/55865e51-8124-4212-921c-3d892fb54901-kube-api-access-25jj8\") pod \"redhat-operators-kbp2n\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.454793 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-catalog-content\") pod \"redhat-operators-kbp2n\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.454850 4708 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-utilities\") pod \"redhat-operators-kbp2n\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.454883 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25jj8\" (UniqueName: \"kubernetes.io/projected/55865e51-8124-4212-921c-3d892fb54901-kube-api-access-25jj8\") pod \"redhat-operators-kbp2n\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.455289 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-catalog-content\") pod \"redhat-operators-kbp2n\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.455407 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-utilities\") pod \"redhat-operators-kbp2n\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.478737 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25jj8\" (UniqueName: \"kubernetes.io/projected/55865e51-8124-4212-921c-3d892fb54901-kube-api-access-25jj8\") pod \"redhat-operators-kbp2n\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.515412 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:42 crc kubenswrapper[4708]: I0320 16:14:42.746684 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbp2n"] Mar 20 16:14:43 crc kubenswrapper[4708]: I0320 16:14:43.593480 4708 generic.go:334] "Generic (PLEG): container finished" podID="55865e51-8124-4212-921c-3d892fb54901" containerID="977ec6f3dd72ceb7127194c8d7714b7d581c563bf40074a66bb0e6334e14a9f9" exitCode=0 Mar 20 16:14:43 crc kubenswrapper[4708]: I0320 16:14:43.593763 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbp2n" event={"ID":"55865e51-8124-4212-921c-3d892fb54901","Type":"ContainerDied","Data":"977ec6f3dd72ceb7127194c8d7714b7d581c563bf40074a66bb0e6334e14a9f9"} Mar 20 16:14:43 crc kubenswrapper[4708]: I0320 16:14:43.594504 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbp2n" event={"ID":"55865e51-8124-4212-921c-3d892fb54901","Type":"ContainerStarted","Data":"aef55e9f9a3f6993978e9f77679c1245819ff517dc31c982b959fa6ad1fa501e"} Mar 20 16:14:43 crc kubenswrapper[4708]: I0320 16:14:43.596463 4708 generic.go:334] "Generic (PLEG): container finished" podID="4b39784f-91ed-47c3-a778-3bd4f77ca786" containerID="9eb3acd39c1afb52ee41c1bf384847bae413358baf5c58f101cf59a766cc03bf" exitCode=0 Mar 20 16:14:43 crc kubenswrapper[4708]: I0320 16:14:43.596545 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" event={"ID":"4b39784f-91ed-47c3-a778-3bd4f77ca786","Type":"ContainerDied","Data":"9eb3acd39c1afb52ee41c1bf384847bae413358baf5c58f101cf59a766cc03bf"} Mar 20 16:14:45 crc kubenswrapper[4708]: I0320 16:14:45.614051 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbp2n" 
event={"ID":"55865e51-8124-4212-921c-3d892fb54901","Type":"ContainerStarted","Data":"0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de"} Mar 20 16:14:46 crc kubenswrapper[4708]: I0320 16:14:46.620598 4708 generic.go:334] "Generic (PLEG): container finished" podID="55865e51-8124-4212-921c-3d892fb54901" containerID="0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de" exitCode=0 Mar 20 16:14:46 crc kubenswrapper[4708]: I0320 16:14:46.620656 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbp2n" event={"ID":"55865e51-8124-4212-921c-3d892fb54901","Type":"ContainerDied","Data":"0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de"} Mar 20 16:14:49 crc kubenswrapper[4708]: I0320 16:14:49.647663 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbp2n" event={"ID":"55865e51-8124-4212-921c-3d892fb54901","Type":"ContainerStarted","Data":"43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957"} Mar 20 16:14:49 crc kubenswrapper[4708]: I0320 16:14:49.649589 4708 generic.go:334] "Generic (PLEG): container finished" podID="4b39784f-91ed-47c3-a778-3bd4f77ca786" containerID="7cfcfa0ebc0725dd9041833d74e11dac3111131454dcd5e1257f02316db34dcb" exitCode=0 Mar 20 16:14:49 crc kubenswrapper[4708]: I0320 16:14:49.649637 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" event={"ID":"4b39784f-91ed-47c3-a778-3bd4f77ca786","Type":"ContainerDied","Data":"7cfcfa0ebc0725dd9041833d74e11dac3111131454dcd5e1257f02316db34dcb"} Mar 20 16:14:49 crc kubenswrapper[4708]: I0320 16:14:49.673281 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kbp2n" podStartSLOduration=2.69204696 podStartE2EDuration="7.673252382s" podCreationTimestamp="2026-03-20 16:14:42 +0000 UTC" firstStartedPulling="2026-03-20 
16:14:43.595348015 +0000 UTC m=+838.269684730" lastFinishedPulling="2026-03-20 16:14:48.576553427 +0000 UTC m=+843.250890152" observedRunningTime="2026-03-20 16:14:49.667030542 +0000 UTC m=+844.341367287" watchObservedRunningTime="2026-03-20 16:14:49.673252382 +0000 UTC m=+844.347589117" Mar 20 16:14:50 crc kubenswrapper[4708]: I0320 16:14:50.667869 4708 generic.go:334] "Generic (PLEG): container finished" podID="4b39784f-91ed-47c3-a778-3bd4f77ca786" containerID="fc0d4fe6d3f430db003727624313e78f0050ca6e0af10a440d2d79209f18e8f9" exitCode=0 Mar 20 16:14:50 crc kubenswrapper[4708]: I0320 16:14:50.667978 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" event={"ID":"4b39784f-91ed-47c3-a778-3bd4f77ca786","Type":"ContainerDied","Data":"fc0d4fe6d3f430db003727624313e78f0050ca6e0af10a440d2d79209f18e8f9"} Mar 20 16:14:51 crc kubenswrapper[4708]: I0320 16:14:51.899954 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:51 crc kubenswrapper[4708]: I0320 16:14:51.984259 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-bundle\") pod \"4b39784f-91ed-47c3-a778-3bd4f77ca786\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " Mar 20 16:14:51 crc kubenswrapper[4708]: I0320 16:14:51.984325 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvd8w\" (UniqueName: \"kubernetes.io/projected/4b39784f-91ed-47c3-a778-3bd4f77ca786-kube-api-access-rvd8w\") pod \"4b39784f-91ed-47c3-a778-3bd4f77ca786\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " Mar 20 16:14:51 crc kubenswrapper[4708]: I0320 16:14:51.984359 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-util\") pod \"4b39784f-91ed-47c3-a778-3bd4f77ca786\" (UID: \"4b39784f-91ed-47c3-a778-3bd4f77ca786\") " Mar 20 16:14:51 crc kubenswrapper[4708]: I0320 16:14:51.985255 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-bundle" (OuterVolumeSpecName: "bundle") pod "4b39784f-91ed-47c3-a778-3bd4f77ca786" (UID: "4b39784f-91ed-47c3-a778-3bd4f77ca786"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:14:51 crc kubenswrapper[4708]: I0320 16:14:51.992115 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b39784f-91ed-47c3-a778-3bd4f77ca786-kube-api-access-rvd8w" (OuterVolumeSpecName: "kube-api-access-rvd8w") pod "4b39784f-91ed-47c3-a778-3bd4f77ca786" (UID: "4b39784f-91ed-47c3-a778-3bd4f77ca786"). InnerVolumeSpecName "kube-api-access-rvd8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:14:51 crc kubenswrapper[4708]: I0320 16:14:51.995310 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-util" (OuterVolumeSpecName: "util") pod "4b39784f-91ed-47c3-a778-3bd4f77ca786" (UID: "4b39784f-91ed-47c3-a778-3bd4f77ca786"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:14:52 crc kubenswrapper[4708]: I0320 16:14:52.085434 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvd8w\" (UniqueName: \"kubernetes.io/projected/4b39784f-91ed-47c3-a778-3bd4f77ca786-kube-api-access-rvd8w\") on node \"crc\" DevicePath \"\"" Mar 20 16:14:52 crc kubenswrapper[4708]: I0320 16:14:52.085468 4708 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-util\") on node \"crc\" DevicePath \"\"" Mar 20 16:14:52 crc kubenswrapper[4708]: I0320 16:14:52.085477 4708 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b39784f-91ed-47c3-a778-3bd4f77ca786-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:14:52 crc kubenswrapper[4708]: I0320 16:14:52.516142 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:52 crc kubenswrapper[4708]: I0320 16:14:52.516889 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:14:52 crc kubenswrapper[4708]: I0320 16:14:52.683199 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" event={"ID":"4b39784f-91ed-47c3-a778-3bd4f77ca786","Type":"ContainerDied","Data":"1497a9d3f41cc7fd05566f0d5aa9972c81c5b792ac60c114f5f717e37e0fe291"} Mar 20 16:14:52 
crc kubenswrapper[4708]: I0320 16:14:52.683258 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1497a9d3f41cc7fd05566f0d5aa9972c81c5b792ac60c114f5f717e37e0fe291" Mar 20 16:14:52 crc kubenswrapper[4708]: I0320 16:14:52.683329 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m" Mar 20 16:14:53 crc kubenswrapper[4708]: I0320 16:14:53.563715 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kbp2n" podUID="55865e51-8124-4212-921c-3d892fb54901" containerName="registry-server" probeResult="failure" output=< Mar 20 16:14:53 crc kubenswrapper[4708]: timeout: failed to connect service ":50051" within 1s Mar 20 16:14:53 crc kubenswrapper[4708]: > Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.178875 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.179289 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.179354 4708 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.180421 4708 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d5c914e606937f3e664e2e028c8e0d408cfdd520b7dcf21b2c2c87ba6f561a9d"} pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.180517 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" containerID="cri-o://d5c914e606937f3e664e2e028c8e0d408cfdd520b7dcf21b2c2c87ba6f561a9d" gracePeriod=600 Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.549360 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-pqzgj"] Mar 20 16:14:56 crc kubenswrapper[4708]: E0320 16:14:56.549608 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b39784f-91ed-47c3-a778-3bd4f77ca786" containerName="util" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.549628 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b39784f-91ed-47c3-a778-3bd4f77ca786" containerName="util" Mar 20 16:14:56 crc kubenswrapper[4708]: E0320 16:14:56.549651 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b39784f-91ed-47c3-a778-3bd4f77ca786" containerName="extract" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.549661 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b39784f-91ed-47c3-a778-3bd4f77ca786" containerName="extract" Mar 20 16:14:56 crc kubenswrapper[4708]: E0320 16:14:56.549694 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b39784f-91ed-47c3-a778-3bd4f77ca786" containerName="pull" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.549702 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b39784f-91ed-47c3-a778-3bd4f77ca786" containerName="pull" Mar 20 16:14:56 crc 
kubenswrapper[4708]: I0320 16:14:56.549818 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b39784f-91ed-47c3-a778-3bd4f77ca786" containerName="extract" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.550237 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-pqzgj" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.551768 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.551811 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-qmbr4" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.555608 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.562279 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-pqzgj"] Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.646791 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsxn9\" (UniqueName: \"kubernetes.io/projected/25642ab2-e76a-4bd3-83cf-24c5cc896ff5-kube-api-access-hsxn9\") pod \"nmstate-operator-796d4cfff4-pqzgj\" (UID: \"25642ab2-e76a-4bd3-83cf-24c5cc896ff5\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-pqzgj" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.706518 4708 generic.go:334] "Generic (PLEG): container finished" podID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerID="d5c914e606937f3e664e2e028c8e0d408cfdd520b7dcf21b2c2c87ba6f561a9d" exitCode=0 Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.706559 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" 
event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerDied","Data":"d5c914e606937f3e664e2e028c8e0d408cfdd520b7dcf21b2c2c87ba6f561a9d"} Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.706589 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerStarted","Data":"b96515fa4390968ae59bbefc52b318ed0531df03cbc4138681a6e2acfc16a612"} Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.706614 4708 scope.go:117] "RemoveContainer" containerID="86c51de47a3ecb84e60b79380374d49d0675ffee3378ce7301ca5717dc50a07c" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.747464 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsxn9\" (UniqueName: \"kubernetes.io/projected/25642ab2-e76a-4bd3-83cf-24c5cc896ff5-kube-api-access-hsxn9\") pod \"nmstate-operator-796d4cfff4-pqzgj\" (UID: \"25642ab2-e76a-4bd3-83cf-24c5cc896ff5\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-pqzgj" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.768372 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsxn9\" (UniqueName: \"kubernetes.io/projected/25642ab2-e76a-4bd3-83cf-24c5cc896ff5-kube-api-access-hsxn9\") pod \"nmstate-operator-796d4cfff4-pqzgj\" (UID: \"25642ab2-e76a-4bd3-83cf-24c5cc896ff5\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-pqzgj" Mar 20 16:14:56 crc kubenswrapper[4708]: I0320 16:14:56.869186 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-pqzgj" Mar 20 16:14:57 crc kubenswrapper[4708]: I0320 16:14:57.260967 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-pqzgj"] Mar 20 16:14:57 crc kubenswrapper[4708]: I0320 16:14:57.713151 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-pqzgj" event={"ID":"25642ab2-e76a-4bd3-83cf-24c5cc896ff5","Type":"ContainerStarted","Data":"c00a67ecd922c4064c4ff71cdcb72ff5a3eb1b90ec93ec1ab3026a0357face18"} Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.151473 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6"] Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.152476 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.156193 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.156267 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.168761 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6"] Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.295717 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwpws\" (UniqueName: \"kubernetes.io/projected/cfe5754a-75cc-4d59-a5ae-489e646c50a0-kube-api-access-gwpws\") pod \"collect-profiles-29567055-7rtx6\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.295771 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfe5754a-75cc-4d59-a5ae-489e646c50a0-secret-volume\") pod \"collect-profiles-29567055-7rtx6\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.295848 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfe5754a-75cc-4d59-a5ae-489e646c50a0-config-volume\") pod \"collect-profiles-29567055-7rtx6\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.397093 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwpws\" (UniqueName: \"kubernetes.io/projected/cfe5754a-75cc-4d59-a5ae-489e646c50a0-kube-api-access-gwpws\") pod \"collect-profiles-29567055-7rtx6\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.397153 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfe5754a-75cc-4d59-a5ae-489e646c50a0-secret-volume\") pod \"collect-profiles-29567055-7rtx6\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.397185 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/cfe5754a-75cc-4d59-a5ae-489e646c50a0-config-volume\") pod \"collect-profiles-29567055-7rtx6\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.398342 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfe5754a-75cc-4d59-a5ae-489e646c50a0-config-volume\") pod \"collect-profiles-29567055-7rtx6\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.411549 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfe5754a-75cc-4d59-a5ae-489e646c50a0-secret-volume\") pod \"collect-profiles-29567055-7rtx6\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.414298 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwpws\" (UniqueName: \"kubernetes.io/projected/cfe5754a-75cc-4d59-a5ae-489e646c50a0-kube-api-access-gwpws\") pod \"collect-profiles-29567055-7rtx6\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.470048 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:00 crc kubenswrapper[4708]: I0320 16:15:00.749951 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6"] Mar 20 16:15:00 crc kubenswrapper[4708]: W0320 16:15:00.752033 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfe5754a_75cc_4d59_a5ae_489e646c50a0.slice/crio-87922a2525b8a48944a9c022b3805dd7edd0aae53209a783ee1448849a46cb87 WatchSource:0}: Error finding container 87922a2525b8a48944a9c022b3805dd7edd0aae53209a783ee1448849a46cb87: Status 404 returned error can't find the container with id 87922a2525b8a48944a9c022b3805dd7edd0aae53209a783ee1448849a46cb87 Mar 20 16:15:01 crc kubenswrapper[4708]: I0320 16:15:01.742701 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-pqzgj" event={"ID":"25642ab2-e76a-4bd3-83cf-24c5cc896ff5","Type":"ContainerStarted","Data":"2acaf606176d1655ee23d4811e8417001b22bbdc245e0c6aabf2405db1e3671c"} Mar 20 16:15:01 crc kubenswrapper[4708]: I0320 16:15:01.745611 4708 generic.go:334] "Generic (PLEG): container finished" podID="cfe5754a-75cc-4d59-a5ae-489e646c50a0" containerID="49333db53c5b1efc19b01038ec07a618408dbcf4f053711cc96595d7ff7ab194" exitCode=0 Mar 20 16:15:01 crc kubenswrapper[4708]: I0320 16:15:01.745657 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" event={"ID":"cfe5754a-75cc-4d59-a5ae-489e646c50a0","Type":"ContainerDied","Data":"49333db53c5b1efc19b01038ec07a618408dbcf4f053711cc96595d7ff7ab194"} Mar 20 16:15:01 crc kubenswrapper[4708]: I0320 16:15:01.745879 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" 
event={"ID":"cfe5754a-75cc-4d59-a5ae-489e646c50a0","Type":"ContainerStarted","Data":"87922a2525b8a48944a9c022b3805dd7edd0aae53209a783ee1448849a46cb87"} Mar 20 16:15:01 crc kubenswrapper[4708]: I0320 16:15:01.762485 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-pqzgj" podStartSLOduration=2.402393702 podStartE2EDuration="5.762467083s" podCreationTimestamp="2026-03-20 16:14:56 +0000 UTC" firstStartedPulling="2026-03-20 16:14:57.271201057 +0000 UTC m=+851.945537772" lastFinishedPulling="2026-03-20 16:15:00.631274428 +0000 UTC m=+855.305611153" observedRunningTime="2026-03-20 16:15:01.75831706 +0000 UTC m=+856.432653775" watchObservedRunningTime="2026-03-20 16:15:01.762467083 +0000 UTC m=+856.436803798" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.555587 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.594781 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.783864 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9"] Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.785184 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9" Mar 20 16:15:02 crc kubenswrapper[4708]: W0320 16:15:02.787562 4708 reflector.go:561] object-"openshift-nmstate"/"nmstate-handler-dockercfg-4cnf9": failed to list *v1.Secret: secrets "nmstate-handler-dockercfg-4cnf9" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Mar 20 16:15:02 crc kubenswrapper[4708]: E0320 16:15:02.787606 4708 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"nmstate-handler-dockercfg-4cnf9\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"nmstate-handler-dockercfg-4cnf9\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.809139 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2"] Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.810605 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.813433 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.824362 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9"] Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.834747 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2"] Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.852405 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-qc6v2"] Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.856726 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.938452 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h26j5\" (UniqueName: \"kubernetes.io/projected/ecfb4ac0-f430-4e1b-ba99-1850c11ba37f-kube-api-access-h26j5\") pod \"nmstate-metrics-9b8c8685d-fv5f9\" (UID: \"ecfb4ac0-f430-4e1b-ba99-1850c11ba37f\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.938550 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svvzh\" (UniqueName: \"kubernetes.io/projected/a8616b74-fe5d-49c2-9a69-a2448ae072b2-kube-api-access-svvzh\") pod \"nmstate-webhook-5f558f5558-v6ws2\" (UID: \"a8616b74-fe5d-49c2-9a69-a2448ae072b2\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.938590 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8616b74-fe5d-49c2-9a69-a2448ae072b2-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-v6ws2\" (UID: \"a8616b74-fe5d-49c2-9a69-a2448ae072b2\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.939637 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld"] Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.946714 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld"] Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.947455 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.950043 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jtbqr" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.950331 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 16:15:02 crc kubenswrapper[4708]: I0320 16:15:02.950382 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.041142 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/aca1c7f0-410b-4e89-80a8-60f6005cee50-dbus-socket\") pod \"nmstate-handler-qc6v2\" (UID: \"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.041205 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h26j5\" (UniqueName: \"kubernetes.io/projected/ecfb4ac0-f430-4e1b-ba99-1850c11ba37f-kube-api-access-h26j5\") 
pod \"nmstate-metrics-9b8c8685d-fv5f9\" (UID: \"ecfb4ac0-f430-4e1b-ba99-1850c11ba37f\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.041232 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2ts2\" (UniqueName: \"kubernetes.io/projected/aca1c7f0-410b-4e89-80a8-60f6005cee50-kube-api-access-l2ts2\") pod \"nmstate-handler-qc6v2\" (UID: \"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.041270 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/aca1c7f0-410b-4e89-80a8-60f6005cee50-ovs-socket\") pod \"nmstate-handler-qc6v2\" (UID: \"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.041296 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svvzh\" (UniqueName: \"kubernetes.io/projected/a8616b74-fe5d-49c2-9a69-a2448ae072b2-kube-api-access-svvzh\") pod \"nmstate-webhook-5f558f5558-v6ws2\" (UID: \"a8616b74-fe5d-49c2-9a69-a2448ae072b2\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.041324 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8616b74-fe5d-49c2-9a69-a2448ae072b2-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-v6ws2\" (UID: \"a8616b74-fe5d-49c2-9a69-a2448ae072b2\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.041349 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/b64a32bd-b9f9-434c-98fa-7e178997d1f4-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-thkld\" (UID: \"b64a32bd-b9f9-434c-98fa-7e178997d1f4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.041364 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/aca1c7f0-410b-4e89-80a8-60f6005cee50-nmstate-lock\") pod \"nmstate-handler-qc6v2\" (UID: \"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.041379 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd5pc\" (UniqueName: \"kubernetes.io/projected/b64a32bd-b9f9-434c-98fa-7e178997d1f4-kube-api-access-nd5pc\") pod \"nmstate-console-plugin-86f58fcf4-thkld\" (UID: \"b64a32bd-b9f9-434c-98fa-7e178997d1f4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.041403 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b64a32bd-b9f9-434c-98fa-7e178997d1f4-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-thkld\" (UID: \"b64a32bd-b9f9-434c-98fa-7e178997d1f4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" Mar 20 16:15:03 crc kubenswrapper[4708]: E0320 16:15:03.041898 4708 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 20 16:15:03 crc kubenswrapper[4708]: E0320 16:15:03.041988 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8616b74-fe5d-49c2-9a69-a2448ae072b2-tls-key-pair podName:a8616b74-fe5d-49c2-9a69-a2448ae072b2 nodeName:}" failed. 
No retries permitted until 2026-03-20 16:15:03.541966734 +0000 UTC m=+858.216303449 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/a8616b74-fe5d-49c2-9a69-a2448ae072b2-tls-key-pair") pod "nmstate-webhook-5f558f5558-v6ws2" (UID: "a8616b74-fe5d-49c2-9a69-a2448ae072b2") : secret "openshift-nmstate-webhook" not found Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.064035 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h26j5\" (UniqueName: \"kubernetes.io/projected/ecfb4ac0-f430-4e1b-ba99-1850c11ba37f-kube-api-access-h26j5\") pod \"nmstate-metrics-9b8c8685d-fv5f9\" (UID: \"ecfb4ac0-f430-4e1b-ba99-1850c11ba37f\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.064639 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svvzh\" (UniqueName: \"kubernetes.io/projected/a8616b74-fe5d-49c2-9a69-a2448ae072b2-kube-api-access-svvzh\") pod \"nmstate-webhook-5f558f5558-v6ws2\" (UID: \"a8616b74-fe5d-49c2-9a69-a2448ae072b2\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.105435 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.142699 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/aca1c7f0-410b-4e89-80a8-60f6005cee50-ovs-socket\") pod \"nmstate-handler-qc6v2\" (UID: \"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.142792 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b64a32bd-b9f9-434c-98fa-7e178997d1f4-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-thkld\" (UID: \"b64a32bd-b9f9-434c-98fa-7e178997d1f4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.142808 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/aca1c7f0-410b-4e89-80a8-60f6005cee50-nmstate-lock\") pod \"nmstate-handler-qc6v2\" (UID: \"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.142824 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd5pc\" (UniqueName: \"kubernetes.io/projected/b64a32bd-b9f9-434c-98fa-7e178997d1f4-kube-api-access-nd5pc\") pod \"nmstate-console-plugin-86f58fcf4-thkld\" (UID: \"b64a32bd-b9f9-434c-98fa-7e178997d1f4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.142819 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/aca1c7f0-410b-4e89-80a8-60f6005cee50-ovs-socket\") pod \"nmstate-handler-qc6v2\" (UID: 
\"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.142846 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b64a32bd-b9f9-434c-98fa-7e178997d1f4-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-thkld\" (UID: \"b64a32bd-b9f9-434c-98fa-7e178997d1f4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.142914 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/aca1c7f0-410b-4e89-80a8-60f6005cee50-nmstate-lock\") pod \"nmstate-handler-qc6v2\" (UID: \"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.142954 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/aca1c7f0-410b-4e89-80a8-60f6005cee50-dbus-socket\") pod \"nmstate-handler-qc6v2\" (UID: \"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.143011 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2ts2\" (UniqueName: \"kubernetes.io/projected/aca1c7f0-410b-4e89-80a8-60f6005cee50-kube-api-access-l2ts2\") pod \"nmstate-handler-qc6v2\" (UID: \"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.143363 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/aca1c7f0-410b-4e89-80a8-60f6005cee50-dbus-socket\") pod \"nmstate-handler-qc6v2\" (UID: \"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " 
pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.143984 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b64a32bd-b9f9-434c-98fa-7e178997d1f4-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-thkld\" (UID: \"b64a32bd-b9f9-434c-98fa-7e178997d1f4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.147296 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b64a32bd-b9f9-434c-98fa-7e178997d1f4-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-thkld\" (UID: \"b64a32bd-b9f9-434c-98fa-7e178997d1f4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.160049 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-8548c6c44d-2n9w2"] Mar 20 16:15:03 crc kubenswrapper[4708]: E0320 16:15:03.160272 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe5754a-75cc-4d59-a5ae-489e646c50a0" containerName="collect-profiles" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.160283 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe5754a-75cc-4d59-a5ae-489e646c50a0" containerName="collect-profiles" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.160374 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe5754a-75cc-4d59-a5ae-489e646c50a0" containerName="collect-profiles" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.160766 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.171567 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2ts2\" (UniqueName: \"kubernetes.io/projected/aca1c7f0-410b-4e89-80a8-60f6005cee50-kube-api-access-l2ts2\") pod \"nmstate-handler-qc6v2\" (UID: \"aca1c7f0-410b-4e89-80a8-60f6005cee50\") " pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.174232 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd5pc\" (UniqueName: \"kubernetes.io/projected/b64a32bd-b9f9-434c-98fa-7e178997d1f4-kube-api-access-nd5pc\") pod \"nmstate-console-plugin-86f58fcf4-thkld\" (UID: \"b64a32bd-b9f9-434c-98fa-7e178997d1f4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.176851 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8548c6c44d-2n9w2"] Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.243762 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwpws\" (UniqueName: \"kubernetes.io/projected/cfe5754a-75cc-4d59-a5ae-489e646c50a0-kube-api-access-gwpws\") pod \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.243860 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfe5754a-75cc-4d59-a5ae-489e646c50a0-config-volume\") pod \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.243894 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cfe5754a-75cc-4d59-a5ae-489e646c50a0-secret-volume\") pod \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\" (UID: \"cfe5754a-75cc-4d59-a5ae-489e646c50a0\") " Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.244760 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe5754a-75cc-4d59-a5ae-489e646c50a0-config-volume" (OuterVolumeSpecName: "config-volume") pod "cfe5754a-75cc-4d59-a5ae-489e646c50a0" (UID: "cfe5754a-75cc-4d59-a5ae-489e646c50a0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.247119 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe5754a-75cc-4d59-a5ae-489e646c50a0-kube-api-access-gwpws" (OuterVolumeSpecName: "kube-api-access-gwpws") pod "cfe5754a-75cc-4d59-a5ae-489e646c50a0" (UID: "cfe5754a-75cc-4d59-a5ae-489e646c50a0"). InnerVolumeSpecName "kube-api-access-gwpws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.247236 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe5754a-75cc-4d59-a5ae-489e646c50a0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cfe5754a-75cc-4d59-a5ae-489e646c50a0" (UID: "cfe5754a-75cc-4d59-a5ae-489e646c50a0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.275587 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.347094 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5dh7\" (UniqueName: \"kubernetes.io/projected/e2727730-0255-4e1c-9db4-9b47387e725e-kube-api-access-f5dh7\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.347135 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2727730-0255-4e1c-9db4-9b47387e725e-console-oauth-config\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.347171 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-console-config\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.347301 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-trusted-ca-bundle\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.347380 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e2727730-0255-4e1c-9db4-9b47387e725e-console-serving-cert\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.347415 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-service-ca\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.347523 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-oauth-serving-cert\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.347595 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwpws\" (UniqueName: \"kubernetes.io/projected/cfe5754a-75cc-4d59-a5ae-489e646c50a0-kube-api-access-gwpws\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.347607 4708 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfe5754a-75cc-4d59-a5ae-489e646c50a0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.347617 4708 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfe5754a-75cc-4d59-a5ae-489e646c50a0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.453053 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f5dh7\" (UniqueName: \"kubernetes.io/projected/e2727730-0255-4e1c-9db4-9b47387e725e-kube-api-access-f5dh7\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.453137 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2727730-0255-4e1c-9db4-9b47387e725e-console-oauth-config\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.453166 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-console-config\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.453205 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-trusted-ca-bundle\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.453250 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2727730-0255-4e1c-9db4-9b47387e725e-console-serving-cert\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.453305 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-service-ca\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.453357 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-oauth-serving-cert\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.454583 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-oauth-serving-cert\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.455167 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-trusted-ca-bundle\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.455982 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-console-config\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.457900 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e2727730-0255-4e1c-9db4-9b47387e725e-service-ca\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.459354 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2727730-0255-4e1c-9db4-9b47387e725e-console-oauth-config\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.459489 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2727730-0255-4e1c-9db4-9b47387e725e-console-serving-cert\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.474473 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld"] Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.477317 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5dh7\" (UniqueName: \"kubernetes.io/projected/e2727730-0255-4e1c-9db4-9b47387e725e-kube-api-access-f5dh7\") pod \"console-8548c6c44d-2n9w2\" (UID: \"e2727730-0255-4e1c-9db4-9b47387e725e\") " pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.486764 4708 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.532042 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.554866 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8616b74-fe5d-49c2-9a69-a2448ae072b2-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-v6ws2\" (UID: \"a8616b74-fe5d-49c2-9a69-a2448ae072b2\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.559587 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8616b74-fe5d-49c2-9a69-a2448ae072b2-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-v6ws2\" (UID: \"a8616b74-fe5d-49c2-9a69-a2448ae072b2\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.737165 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8548c6c44d-2n9w2"] Mar 20 16:15:03 crc kubenswrapper[4708]: W0320 16:15:03.744987 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2727730_0255_4e1c_9db4_9b47387e725e.slice/crio-d94849922f10206955da9e44cde422bed5639e0f404f418be8fa54abae4f4285 WatchSource:0}: Error finding container d94849922f10206955da9e44cde422bed5639e0f404f418be8fa54abae4f4285: Status 404 returned error can't find the container with id d94849922f10206955da9e44cde422bed5639e0f404f418be8fa54abae4f4285 Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.756363 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.756345 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-7rtx6" event={"ID":"cfe5754a-75cc-4d59-a5ae-489e646c50a0","Type":"ContainerDied","Data":"87922a2525b8a48944a9c022b3805dd7edd0aae53209a783ee1448849a46cb87"} Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.756934 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87922a2525b8a48944a9c022b3805dd7edd0aae53209a783ee1448849a46cb87" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.758088 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8548c6c44d-2n9w2" event={"ID":"e2727730-0255-4e1c-9db4-9b47387e725e","Type":"ContainerStarted","Data":"d94849922f10206955da9e44cde422bed5639e0f404f418be8fa54abae4f4285"} Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.759603 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" event={"ID":"b64a32bd-b9f9-434c-98fa-7e178997d1f4","Type":"ContainerStarted","Data":"c53b219db418d4c96f0302e8edbb19915f27161700c585e6e3834368f75a905c"} Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.796513 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-4cnf9" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.799315 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.799361 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9" Mar 20 16:15:03 crc kubenswrapper[4708]: I0320 16:15:03.801293 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" Mar 20 16:15:03 crc kubenswrapper[4708]: W0320 16:15:03.833751 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaca1c7f0_410b_4e89_80a8_60f6005cee50.slice/crio-b599dbc07acc5f720daf40c7bc7280b85da33559d0a9db9276234518af659325 WatchSource:0}: Error finding container b599dbc07acc5f720daf40c7bc7280b85da33559d0a9db9276234518af659325: Status 404 returned error can't find the container with id b599dbc07acc5f720daf40c7bc7280b85da33559d0a9db9276234518af659325 Mar 20 16:15:04 crc kubenswrapper[4708]: I0320 16:15:04.058281 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9"] Mar 20 16:15:04 crc kubenswrapper[4708]: W0320 16:15:04.078404 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecfb4ac0_f430_4e1b_ba99_1850c11ba37f.slice/crio-feb69d3ff21195708339e5fd96faa6518cfc543e94977427b6e0e5a3d2da6ebb WatchSource:0}: Error finding container feb69d3ff21195708339e5fd96faa6518cfc543e94977427b6e0e5a3d2da6ebb: Status 404 returned error can't find the container with id feb69d3ff21195708339e5fd96faa6518cfc543e94977427b6e0e5a3d2da6ebb Mar 20 16:15:04 crc kubenswrapper[4708]: I0320 16:15:04.103691 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2"] Mar 20 16:15:04 crc kubenswrapper[4708]: I0320 16:15:04.716975 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbp2n"] Mar 20 16:15:04 crc kubenswrapper[4708]: I0320 16:15:04.717256 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kbp2n" podUID="55865e51-8124-4212-921c-3d892fb54901" containerName="registry-server" 
containerID="cri-o://43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957" gracePeriod=2 Mar 20 16:15:04 crc kubenswrapper[4708]: I0320 16:15:04.767230 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9" event={"ID":"ecfb4ac0-f430-4e1b-ba99-1850c11ba37f","Type":"ContainerStarted","Data":"feb69d3ff21195708339e5fd96faa6518cfc543e94977427b6e0e5a3d2da6ebb"} Mar 20 16:15:04 crc kubenswrapper[4708]: I0320 16:15:04.768488 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" event={"ID":"a8616b74-fe5d-49c2-9a69-a2448ae072b2","Type":"ContainerStarted","Data":"b423cba5bc42157508fe33e1a7d7e567c0c770cd0508d59737645b3ed0123b1b"} Mar 20 16:15:04 crc kubenswrapper[4708]: I0320 16:15:04.770401 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8548c6c44d-2n9w2" event={"ID":"e2727730-0255-4e1c-9db4-9b47387e725e","Type":"ContainerStarted","Data":"c235bc25df7b9892df2f60c9cd862ff6a1104a26bd646b9e5d923edde0574c4d"} Mar 20 16:15:04 crc kubenswrapper[4708]: I0320 16:15:04.771618 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qc6v2" event={"ID":"aca1c7f0-410b-4e89-80a8-60f6005cee50","Type":"ContainerStarted","Data":"b599dbc07acc5f720daf40c7bc7280b85da33559d0a9db9276234518af659325"} Mar 20 16:15:04 crc kubenswrapper[4708]: I0320 16:15:04.789798 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8548c6c44d-2n9w2" podStartSLOduration=1.789773203 podStartE2EDuration="1.789773203s" podCreationTimestamp="2026-03-20 16:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:15:04.78672954 +0000 UTC m=+859.461066255" watchObservedRunningTime="2026-03-20 16:15:04.789773203 +0000 UTC m=+859.464109918" Mar 20 16:15:05 crc 
kubenswrapper[4708]: I0320 16:15:05.097018 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.178963 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25jj8\" (UniqueName: \"kubernetes.io/projected/55865e51-8124-4212-921c-3d892fb54901-kube-api-access-25jj8\") pod \"55865e51-8124-4212-921c-3d892fb54901\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.179032 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-utilities\") pod \"55865e51-8124-4212-921c-3d892fb54901\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.179065 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-catalog-content\") pod \"55865e51-8124-4212-921c-3d892fb54901\" (UID: \"55865e51-8124-4212-921c-3d892fb54901\") " Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.180632 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-utilities" (OuterVolumeSpecName: "utilities") pod "55865e51-8124-4212-921c-3d892fb54901" (UID: "55865e51-8124-4212-921c-3d892fb54901"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.195808 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55865e51-8124-4212-921c-3d892fb54901-kube-api-access-25jj8" (OuterVolumeSpecName: "kube-api-access-25jj8") pod "55865e51-8124-4212-921c-3d892fb54901" (UID: "55865e51-8124-4212-921c-3d892fb54901"). InnerVolumeSpecName "kube-api-access-25jj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.286119 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.286163 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25jj8\" (UniqueName: \"kubernetes.io/projected/55865e51-8124-4212-921c-3d892fb54901-kube-api-access-25jj8\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.303659 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55865e51-8124-4212-921c-3d892fb54901" (UID: "55865e51-8124-4212-921c-3d892fb54901"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.387612 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55865e51-8124-4212-921c-3d892fb54901-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.780005 4708 generic.go:334] "Generic (PLEG): container finished" podID="55865e51-8124-4212-921c-3d892fb54901" containerID="43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957" exitCode=0 Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.780109 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbp2n" Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.780114 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbp2n" event={"ID":"55865e51-8124-4212-921c-3d892fb54901","Type":"ContainerDied","Data":"43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957"} Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.780180 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbp2n" event={"ID":"55865e51-8124-4212-921c-3d892fb54901","Type":"ContainerDied","Data":"aef55e9f9a3f6993978e9f77679c1245819ff517dc31c982b959fa6ad1fa501e"} Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.780203 4708 scope.go:117] "RemoveContainer" containerID="43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957" Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.816889 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbp2n"] Mar 20 16:15:05 crc kubenswrapper[4708]: I0320 16:15:05.823264 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kbp2n"] Mar 20 16:15:06 crc kubenswrapper[4708]: I0320 16:15:06.032645 
4708 scope.go:117] "RemoveContainer" containerID="0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de" Mar 20 16:15:06 crc kubenswrapper[4708]: I0320 16:15:06.085945 4708 scope.go:117] "RemoveContainer" containerID="977ec6f3dd72ceb7127194c8d7714b7d581c563bf40074a66bb0e6334e14a9f9" Mar 20 16:15:06 crc kubenswrapper[4708]: I0320 16:15:06.100525 4708 scope.go:117] "RemoveContainer" containerID="43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957" Mar 20 16:15:06 crc kubenswrapper[4708]: E0320 16:15:06.102046 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957\": container with ID starting with 43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957 not found: ID does not exist" containerID="43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957" Mar 20 16:15:06 crc kubenswrapper[4708]: I0320 16:15:06.102079 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957"} err="failed to get container status \"43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957\": rpc error: code = NotFound desc = could not find container \"43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957\": container with ID starting with 43b91e530717984246577e7c97040f57258a104ada131237826fc09b6c134957 not found: ID does not exist" Mar 20 16:15:06 crc kubenswrapper[4708]: I0320 16:15:06.102104 4708 scope.go:117] "RemoveContainer" containerID="0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de" Mar 20 16:15:06 crc kubenswrapper[4708]: E0320 16:15:06.102496 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de\": container with ID starting 
with 0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de not found: ID does not exist" containerID="0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de" Mar 20 16:15:06 crc kubenswrapper[4708]: I0320 16:15:06.102527 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de"} err="failed to get container status \"0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de\": rpc error: code = NotFound desc = could not find container \"0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de\": container with ID starting with 0ac298f47802dac5ddf9d5b20cc1d623616c66b4c02ba0a92fe16f61817a56de not found: ID does not exist" Mar 20 16:15:06 crc kubenswrapper[4708]: I0320 16:15:06.102547 4708 scope.go:117] "RemoveContainer" containerID="977ec6f3dd72ceb7127194c8d7714b7d581c563bf40074a66bb0e6334e14a9f9" Mar 20 16:15:06 crc kubenswrapper[4708]: E0320 16:15:06.102926 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"977ec6f3dd72ceb7127194c8d7714b7d581c563bf40074a66bb0e6334e14a9f9\": container with ID starting with 977ec6f3dd72ceb7127194c8d7714b7d581c563bf40074a66bb0e6334e14a9f9 not found: ID does not exist" containerID="977ec6f3dd72ceb7127194c8d7714b7d581c563bf40074a66bb0e6334e14a9f9" Mar 20 16:15:06 crc kubenswrapper[4708]: I0320 16:15:06.102961 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"977ec6f3dd72ceb7127194c8d7714b7d581c563bf40074a66bb0e6334e14a9f9"} err="failed to get container status \"977ec6f3dd72ceb7127194c8d7714b7d581c563bf40074a66bb0e6334e14a9f9\": rpc error: code = NotFound desc = could not find container \"977ec6f3dd72ceb7127194c8d7714b7d581c563bf40074a66bb0e6334e14a9f9\": container with ID starting with 977ec6f3dd72ceb7127194c8d7714b7d581c563bf40074a66bb0e6334e14a9f9 not found: ID does 
not exist" Mar 20 16:15:06 crc kubenswrapper[4708]: I0320 16:15:06.122994 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55865e51-8124-4212-921c-3d892fb54901" path="/var/lib/kubelet/pods/55865e51-8124-4212-921c-3d892fb54901/volumes" Mar 20 16:15:06 crc kubenswrapper[4708]: I0320 16:15:06.787531 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" event={"ID":"b64a32bd-b9f9-434c-98fa-7e178997d1f4","Type":"ContainerStarted","Data":"3d205f558f693f490a9b470fbc3d8a8716a3107d7bc140d3467bf9349bfc782a"} Mar 20 16:15:06 crc kubenswrapper[4708]: I0320 16:15:06.806756 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-thkld" podStartSLOduration=2.188226047 podStartE2EDuration="4.80673356s" podCreationTimestamp="2026-03-20 16:15:02 +0000 UTC" firstStartedPulling="2026-03-20 16:15:03.486470074 +0000 UTC m=+858.160806789" lastFinishedPulling="2026-03-20 16:15:06.104977587 +0000 UTC m=+860.779314302" observedRunningTime="2026-03-20 16:15:06.803872612 +0000 UTC m=+861.478209327" watchObservedRunningTime="2026-03-20 16:15:06.80673356 +0000 UTC m=+861.481070275" Mar 20 16:15:08 crc kubenswrapper[4708]: I0320 16:15:08.806464 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9" event={"ID":"ecfb4ac0-f430-4e1b-ba99-1850c11ba37f","Type":"ContainerStarted","Data":"7d7fd75395954645510ed0ed7608b6d4deff559af43c97739400f251eb63f016"} Mar 20 16:15:08 crc kubenswrapper[4708]: I0320 16:15:08.809039 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" event={"ID":"a8616b74-fe5d-49c2-9a69-a2448ae072b2","Type":"ContainerStarted","Data":"08f9a5f9bd5bbc031a52493e1819c855061ee889ce4572c60ddc2b583d9e5f60"} Mar 20 16:15:08 crc kubenswrapper[4708]: I0320 16:15:08.809208 4708 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" Mar 20 16:15:08 crc kubenswrapper[4708]: I0320 16:15:08.810626 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qc6v2" event={"ID":"aca1c7f0-410b-4e89-80a8-60f6005cee50","Type":"ContainerStarted","Data":"90cd4659b1e3bf572a0d84add1304f1f0b081cedaac09da501374230a63aa236"} Mar 20 16:15:08 crc kubenswrapper[4708]: I0320 16:15:08.810849 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:08 crc kubenswrapper[4708]: I0320 16:15:08.826030 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" podStartSLOduration=2.924220573 podStartE2EDuration="6.826010229s" podCreationTimestamp="2026-03-20 16:15:02 +0000 UTC" firstStartedPulling="2026-03-20 16:15:04.111445028 +0000 UTC m=+858.785781743" lastFinishedPulling="2026-03-20 16:15:08.013234684 +0000 UTC m=+862.687571399" observedRunningTime="2026-03-20 16:15:08.824351944 +0000 UTC m=+863.498688679" watchObservedRunningTime="2026-03-20 16:15:08.826010229 +0000 UTC m=+863.500346974" Mar 20 16:15:08 crc kubenswrapper[4708]: I0320 16:15:08.844537 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-qc6v2" podStartSLOduration=2.655430576 podStartE2EDuration="6.844515543s" podCreationTimestamp="2026-03-20 16:15:02 +0000 UTC" firstStartedPulling="2026-03-20 16:15:03.839128134 +0000 UTC m=+858.513464849" lastFinishedPulling="2026-03-20 16:15:08.028213101 +0000 UTC m=+862.702549816" observedRunningTime="2026-03-20 16:15:08.841949983 +0000 UTC m=+863.516286698" watchObservedRunningTime="2026-03-20 16:15:08.844515543 +0000 UTC m=+863.518852258" Mar 20 16:15:11 crc kubenswrapper[4708]: I0320 16:15:11.178005 4708 scope.go:117] "RemoveContainer" 
containerID="4eccf946645350d85645c5cec6efbd3ad200868cd3f27bfe46981cb45c67e972" Mar 20 16:15:12 crc kubenswrapper[4708]: I0320 16:15:12.838211 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9" event={"ID":"ecfb4ac0-f430-4e1b-ba99-1850c11ba37f","Type":"ContainerStarted","Data":"2968b7c22542c6887c3ebd022a0442beae07fb1ee7ec8687f2e0a873ce7e22d9"} Mar 20 16:15:12 crc kubenswrapper[4708]: I0320 16:15:12.860301 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fv5f9" podStartSLOduration=2.57368753 podStartE2EDuration="10.860274822s" podCreationTimestamp="2026-03-20 16:15:02 +0000 UTC" firstStartedPulling="2026-03-20 16:15:04.082766686 +0000 UTC m=+858.757103401" lastFinishedPulling="2026-03-20 16:15:12.369353978 +0000 UTC m=+867.043690693" observedRunningTime="2026-03-20 16:15:12.85471164 +0000 UTC m=+867.529048355" watchObservedRunningTime="2026-03-20 16:15:12.860274822 +0000 UTC m=+867.534611567" Mar 20 16:15:13 crc kubenswrapper[4708]: I0320 16:15:13.532474 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:13 crc kubenswrapper[4708]: I0320 16:15:13.532524 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:13 crc kubenswrapper[4708]: I0320 16:15:13.892504 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:13 crc kubenswrapper[4708]: I0320 16:15:13.924484 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8548c6c44d-2n9w2" Mar 20 16:15:13 crc kubenswrapper[4708]: I0320 16:15:13.938891 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-qc6v2" Mar 20 16:15:14 crc 
kubenswrapper[4708]: I0320 16:15:14.016269 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kdrms"] Mar 20 16:15:23 crc kubenswrapper[4708]: I0320 16:15:23.807883 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-v6ws2" Mar 20 16:15:36 crc kubenswrapper[4708]: I0320 16:15:36.884098 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z"] Mar 20 16:15:36 crc kubenswrapper[4708]: E0320 16:15:36.885149 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55865e51-8124-4212-921c-3d892fb54901" containerName="extract-content" Mar 20 16:15:36 crc kubenswrapper[4708]: I0320 16:15:36.885166 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="55865e51-8124-4212-921c-3d892fb54901" containerName="extract-content" Mar 20 16:15:36 crc kubenswrapper[4708]: E0320 16:15:36.885197 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55865e51-8124-4212-921c-3d892fb54901" containerName="registry-server" Mar 20 16:15:36 crc kubenswrapper[4708]: I0320 16:15:36.885220 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="55865e51-8124-4212-921c-3d892fb54901" containerName="registry-server" Mar 20 16:15:36 crc kubenswrapper[4708]: E0320 16:15:36.885226 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55865e51-8124-4212-921c-3d892fb54901" containerName="extract-utilities" Mar 20 16:15:36 crc kubenswrapper[4708]: I0320 16:15:36.885233 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="55865e51-8124-4212-921c-3d892fb54901" containerName="extract-utilities" Mar 20 16:15:36 crc kubenswrapper[4708]: I0320 16:15:36.885377 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="55865e51-8124-4212-921c-3d892fb54901" containerName="registry-server" Mar 20 16:15:36 crc kubenswrapper[4708]: I0320 
16:15:36.886434 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:36 crc kubenswrapper[4708]: I0320 16:15:36.892437 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 16:15:36 crc kubenswrapper[4708]: I0320 16:15:36.897253 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z"] Mar 20 16:15:36 crc kubenswrapper[4708]: I0320 16:15:36.921882 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:36 crc kubenswrapper[4708]: I0320 16:15:36.921980 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:36 crc kubenswrapper[4708]: I0320 16:15:36.922013 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfl2h\" (UniqueName: \"kubernetes.io/projected/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-kube-api-access-lfl2h\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:37 crc kubenswrapper[4708]: I0320 16:15:37.022893 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:37 crc kubenswrapper[4708]: I0320 16:15:37.023246 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:37 crc kubenswrapper[4708]: I0320 16:15:37.023373 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfl2h\" (UniqueName: \"kubernetes.io/projected/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-kube-api-access-lfl2h\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:37 crc kubenswrapper[4708]: I0320 16:15:37.023426 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:37 crc kubenswrapper[4708]: I0320 16:15:37.023711 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:37 crc kubenswrapper[4708]: I0320 16:15:37.042511 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfl2h\" (UniqueName: \"kubernetes.io/projected/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-kube-api-access-lfl2h\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:37 crc kubenswrapper[4708]: I0320 16:15:37.202063 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:37 crc kubenswrapper[4708]: I0320 16:15:37.618383 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z"] Mar 20 16:15:38 crc kubenswrapper[4708]: I0320 16:15:38.079750 4708 generic.go:334] "Generic (PLEG): container finished" podID="199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" containerID="bb2d6b356909ff82b005c07b6e822f9d064a63316bfb8bc6340d67b200542243" exitCode=0 Mar 20 16:15:38 crc kubenswrapper[4708]: I0320 16:15:38.079796 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" event={"ID":"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2","Type":"ContainerDied","Data":"bb2d6b356909ff82b005c07b6e822f9d064a63316bfb8bc6340d67b200542243"} Mar 20 16:15:38 crc kubenswrapper[4708]: I0320 16:15:38.080033 4708 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" event={"ID":"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2","Type":"ContainerStarted","Data":"0bb4096ec883cb9c1298461fd746c99cb2d01de0dba42073cf63f04df1c4e640"} Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.072291 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-kdrms" podUID="77884518-e4d9-4a61-b8fb-55b1e2f9e23a" containerName="console" containerID="cri-o://d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb" gracePeriod=15 Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.400094 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-kdrms_77884518-e4d9-4a61-b8fb-55b1e2f9e23a/console/0.log" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.400164 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.552547 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-config\") pod \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.552636 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-oauth-config\") pod \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.552685 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-serving-cert\") pod \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.552729 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-oauth-serving-cert\") pod \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.552755 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-trusted-ca-bundle\") pod \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.552851 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dzlh\" (UniqueName: \"kubernetes.io/projected/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-kube-api-access-6dzlh\") pod \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.552917 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-service-ca\") pod \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\" (UID: \"77884518-e4d9-4a61-b8fb-55b1e2f9e23a\") " Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.553880 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "77884518-e4d9-4a61-b8fb-55b1e2f9e23a" (UID: "77884518-e4d9-4a61-b8fb-55b1e2f9e23a"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.553893 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-service-ca" (OuterVolumeSpecName: "service-ca") pod "77884518-e4d9-4a61-b8fb-55b1e2f9e23a" (UID: "77884518-e4d9-4a61-b8fb-55b1e2f9e23a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.554245 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "77884518-e4d9-4a61-b8fb-55b1e2f9e23a" (UID: "77884518-e4d9-4a61-b8fb-55b1e2f9e23a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.554553 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-config" (OuterVolumeSpecName: "console-config") pod "77884518-e4d9-4a61-b8fb-55b1e2f9e23a" (UID: "77884518-e4d9-4a61-b8fb-55b1e2f9e23a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.570605 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "77884518-e4d9-4a61-b8fb-55b1e2f9e23a" (UID: "77884518-e4d9-4a61-b8fb-55b1e2f9e23a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.571268 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "77884518-e4d9-4a61-b8fb-55b1e2f9e23a" (UID: "77884518-e4d9-4a61-b8fb-55b1e2f9e23a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.572344 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-kube-api-access-6dzlh" (OuterVolumeSpecName: "kube-api-access-6dzlh") pod "77884518-e4d9-4a61-b8fb-55b1e2f9e23a" (UID: "77884518-e4d9-4a61-b8fb-55b1e2f9e23a"). InnerVolumeSpecName "kube-api-access-6dzlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.654409 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dzlh\" (UniqueName: \"kubernetes.io/projected/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-kube-api-access-6dzlh\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.654451 4708 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.654466 4708 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.654477 4708 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.654490 4708 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.654501 4708 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:39 crc kubenswrapper[4708]: I0320 16:15:39.654511 4708 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77884518-e4d9-4a61-b8fb-55b1e2f9e23a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:40 crc kubenswrapper[4708]: I0320 16:15:40.093726 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-kdrms_77884518-e4d9-4a61-b8fb-55b1e2f9e23a/console/0.log" Mar 20 16:15:40 crc kubenswrapper[4708]: I0320 16:15:40.094056 4708 generic.go:334] "Generic (PLEG): container finished" podID="77884518-e4d9-4a61-b8fb-55b1e2f9e23a" containerID="d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb" exitCode=2 Mar 20 16:15:40 crc kubenswrapper[4708]: I0320 16:15:40.094088 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kdrms" event={"ID":"77884518-e4d9-4a61-b8fb-55b1e2f9e23a","Type":"ContainerDied","Data":"d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb"} Mar 20 16:15:40 crc kubenswrapper[4708]: I0320 16:15:40.094112 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kdrms" 
event={"ID":"77884518-e4d9-4a61-b8fb-55b1e2f9e23a","Type":"ContainerDied","Data":"e6dc4f59b5c56a31fb8caa657527ba5b0d8ae42ee0a55d62c4db3d20467d88b2"} Mar 20 16:15:40 crc kubenswrapper[4708]: I0320 16:15:40.094145 4708 scope.go:117] "RemoveContainer" containerID="d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb" Mar 20 16:15:40 crc kubenswrapper[4708]: I0320 16:15:40.094222 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kdrms" Mar 20 16:15:40 crc kubenswrapper[4708]: I0320 16:15:40.111833 4708 scope.go:117] "RemoveContainer" containerID="d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb" Mar 20 16:15:40 crc kubenswrapper[4708]: E0320 16:15:40.112247 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb\": container with ID starting with d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb not found: ID does not exist" containerID="d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb" Mar 20 16:15:40 crc kubenswrapper[4708]: I0320 16:15:40.112287 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb"} err="failed to get container status \"d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb\": rpc error: code = NotFound desc = could not find container \"d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb\": container with ID starting with d1d827289405fb4fa630ca27a7dfa6df5dc607d0c2a2c3f648b14e4f9975d6cb not found: ID does not exist" Mar 20 16:15:40 crc kubenswrapper[4708]: I0320 16:15:40.138207 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kdrms"] Mar 20 16:15:40 crc kubenswrapper[4708]: I0320 16:15:40.144836 4708 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-kdrms"] Mar 20 16:15:42 crc kubenswrapper[4708]: I0320 16:15:42.118320 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77884518-e4d9-4a61-b8fb-55b1e2f9e23a" path="/var/lib/kubelet/pods/77884518-e4d9-4a61-b8fb-55b1e2f9e23a/volumes" Mar 20 16:15:44 crc kubenswrapper[4708]: I0320 16:15:44.119568 4708 generic.go:334] "Generic (PLEG): container finished" podID="199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" containerID="0f7c11881caa664aabbd6339e4a9a1a276ffe52174311784ef11c8024f436fa5" exitCode=0 Mar 20 16:15:44 crc kubenswrapper[4708]: I0320 16:15:44.119612 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" event={"ID":"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2","Type":"ContainerDied","Data":"0f7c11881caa664aabbd6339e4a9a1a276ffe52174311784ef11c8024f436fa5"} Mar 20 16:15:45 crc kubenswrapper[4708]: I0320 16:15:45.129344 4708 generic.go:334] "Generic (PLEG): container finished" podID="199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" containerID="9cd5872b89c388a84301eeb19fa8d7f9c633a57d55b4b17d7e81c861e0d47e31" exitCode=0 Mar 20 16:15:45 crc kubenswrapper[4708]: I0320 16:15:45.129401 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" event={"ID":"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2","Type":"ContainerDied","Data":"9cd5872b89c388a84301eeb19fa8d7f9c633a57d55b4b17d7e81c861e0d47e31"} Mar 20 16:15:46 crc kubenswrapper[4708]: I0320 16:15:46.342261 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:15:46 crc kubenswrapper[4708]: I0320 16:15:46.537715 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-bundle\") pod \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " Mar 20 16:15:46 crc kubenswrapper[4708]: I0320 16:15:46.537759 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfl2h\" (UniqueName: \"kubernetes.io/projected/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-kube-api-access-lfl2h\") pod \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " Mar 20 16:15:46 crc kubenswrapper[4708]: I0320 16:15:46.537923 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-util\") pod \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\" (UID: \"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2\") " Mar 20 16:15:46 crc kubenswrapper[4708]: I0320 16:15:46.538815 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-bundle" (OuterVolumeSpecName: "bundle") pod "199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" (UID: "199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:15:46 crc kubenswrapper[4708]: I0320 16:15:46.542913 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-kube-api-access-lfl2h" (OuterVolumeSpecName: "kube-api-access-lfl2h") pod "199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" (UID: "199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2"). InnerVolumeSpecName "kube-api-access-lfl2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:15:46 crc kubenswrapper[4708]: I0320 16:15:46.549803 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-util" (OuterVolumeSpecName: "util") pod "199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" (UID: "199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:15:46 crc kubenswrapper[4708]: I0320 16:15:46.550886 4708 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-util\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:46 crc kubenswrapper[4708]: I0320 16:15:46.551115 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfl2h\" (UniqueName: \"kubernetes.io/projected/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-kube-api-access-lfl2h\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:46 crc kubenswrapper[4708]: I0320 16:15:46.551139 4708 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:15:47 crc kubenswrapper[4708]: I0320 16:15:47.143536 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" event={"ID":"199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2","Type":"ContainerDied","Data":"0bb4096ec883cb9c1298461fd746c99cb2d01de0dba42073cf63f04df1c4e640"} Mar 20 16:15:47 crc kubenswrapper[4708]: I0320 16:15:47.143580 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bb4096ec883cb9c1298461fd746c99cb2d01de0dba42073cf63f04df1c4e640" Mar 20 16:15:47 crc kubenswrapper[4708]: I0320 16:15:47.143603 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.080361 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv"] Mar 20 16:16:00 crc kubenswrapper[4708]: E0320 16:16:00.080968 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" containerName="extract" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.080985 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" containerName="extract" Mar 20 16:16:00 crc kubenswrapper[4708]: E0320 16:16:00.081004 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77884518-e4d9-4a61-b8fb-55b1e2f9e23a" containerName="console" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.081011 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="77884518-e4d9-4a61-b8fb-55b1e2f9e23a" containerName="console" Mar 20 16:16:00 crc kubenswrapper[4708]: E0320 16:16:00.081026 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" containerName="pull" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.081035 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" containerName="pull" Mar 20 16:16:00 crc kubenswrapper[4708]: E0320 16:16:00.081053 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" containerName="util" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.081059 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" containerName="util" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.081157 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="77884518-e4d9-4a61-b8fb-55b1e2f9e23a" 
containerName="console" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.081168 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2" containerName="extract" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.081566 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.084469 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.084785 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.085307 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.086574 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.087056 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hpwzz" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.102390 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv"] Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.176566 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567056-rbg8m"] Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.177463 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-rbg8m" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.179855 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.180274 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.180513 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.185384 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-rbg8m"] Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.217785 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95dd2919-dc89-4679-a48e-873f255af21e-webhook-cert\") pod \"metallb-operator-controller-manager-6687cdd9c4-rqtvv\" (UID: \"95dd2919-dc89-4679-a48e-873f255af21e\") " pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.217862 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95dd2919-dc89-4679-a48e-873f255af21e-apiservice-cert\") pod \"metallb-operator-controller-manager-6687cdd9c4-rqtvv\" (UID: \"95dd2919-dc89-4679-a48e-873f255af21e\") " pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.217890 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b58mv\" (UniqueName: \"kubernetes.io/projected/95dd2919-dc89-4679-a48e-873f255af21e-kube-api-access-b58mv\") pod 
\"metallb-operator-controller-manager-6687cdd9c4-rqtvv\" (UID: \"95dd2919-dc89-4679-a48e-873f255af21e\") " pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.319933 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95dd2919-dc89-4679-a48e-873f255af21e-webhook-cert\") pod \"metallb-operator-controller-manager-6687cdd9c4-rqtvv\" (UID: \"95dd2919-dc89-4679-a48e-873f255af21e\") " pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.320078 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95dd2919-dc89-4679-a48e-873f255af21e-apiservice-cert\") pod \"metallb-operator-controller-manager-6687cdd9c4-rqtvv\" (UID: \"95dd2919-dc89-4679-a48e-873f255af21e\") " pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.320486 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b58mv\" (UniqueName: \"kubernetes.io/projected/95dd2919-dc89-4679-a48e-873f255af21e-kube-api-access-b58mv\") pod \"metallb-operator-controller-manager-6687cdd9c4-rqtvv\" (UID: \"95dd2919-dc89-4679-a48e-873f255af21e\") " pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.321396 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwn6w\" (UniqueName: \"kubernetes.io/projected/2014e205-8eb6-4dc9-8fa7-f0935f73019d-kube-api-access-pwn6w\") pod \"auto-csr-approver-29567056-rbg8m\" (UID: \"2014e205-8eb6-4dc9-8fa7-f0935f73019d\") " pod="openshift-infra/auto-csr-approver-29567056-rbg8m" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 
16:16:00.333050 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95dd2919-dc89-4679-a48e-873f255af21e-webhook-cert\") pod \"metallb-operator-controller-manager-6687cdd9c4-rqtvv\" (UID: \"95dd2919-dc89-4679-a48e-873f255af21e\") " pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.333050 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95dd2919-dc89-4679-a48e-873f255af21e-apiservice-cert\") pod \"metallb-operator-controller-manager-6687cdd9c4-rqtvv\" (UID: \"95dd2919-dc89-4679-a48e-873f255af21e\") " pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.345032 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b58mv\" (UniqueName: \"kubernetes.io/projected/95dd2919-dc89-4679-a48e-873f255af21e-kube-api-access-b58mv\") pod \"metallb-operator-controller-manager-6687cdd9c4-rqtvv\" (UID: \"95dd2919-dc89-4679-a48e-873f255af21e\") " pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.401906 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.422063 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwn6w\" (UniqueName: \"kubernetes.io/projected/2014e205-8eb6-4dc9-8fa7-f0935f73019d-kube-api-access-pwn6w\") pod \"auto-csr-approver-29567056-rbg8m\" (UID: \"2014e205-8eb6-4dc9-8fa7-f0935f73019d\") " pod="openshift-infra/auto-csr-approver-29567056-rbg8m" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.431511 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm"] Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.432320 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.434952 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.435349 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.435526 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-mcz7b" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.446544 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwn6w\" (UniqueName: \"kubernetes.io/projected/2014e205-8eb6-4dc9-8fa7-f0935f73019d-kube-api-access-pwn6w\") pod \"auto-csr-approver-29567056-rbg8m\" (UID: \"2014e205-8eb6-4dc9-8fa7-f0935f73019d\") " pod="openshift-infra/auto-csr-approver-29567056-rbg8m" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.474455 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm"] Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.491390 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-rbg8m" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.524759 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52725a1b-5472-4e3a-8b88-4ed17ed3c44c-apiservice-cert\") pod \"metallb-operator-webhook-server-7f6b8f677-57vgm\" (UID: \"52725a1b-5472-4e3a-8b88-4ed17ed3c44c\") " pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.524817 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52725a1b-5472-4e3a-8b88-4ed17ed3c44c-webhook-cert\") pod \"metallb-operator-webhook-server-7f6b8f677-57vgm\" (UID: \"52725a1b-5472-4e3a-8b88-4ed17ed3c44c\") " pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.524876 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vt62\" (UniqueName: \"kubernetes.io/projected/52725a1b-5472-4e3a-8b88-4ed17ed3c44c-kube-api-access-2vt62\") pod \"metallb-operator-webhook-server-7f6b8f677-57vgm\" (UID: \"52725a1b-5472-4e3a-8b88-4ed17ed3c44c\") " pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.626277 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52725a1b-5472-4e3a-8b88-4ed17ed3c44c-webhook-cert\") pod \"metallb-operator-webhook-server-7f6b8f677-57vgm\" (UID: \"52725a1b-5472-4e3a-8b88-4ed17ed3c44c\") " 
pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.627370 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vt62\" (UniqueName: \"kubernetes.io/projected/52725a1b-5472-4e3a-8b88-4ed17ed3c44c-kube-api-access-2vt62\") pod \"metallb-operator-webhook-server-7f6b8f677-57vgm\" (UID: \"52725a1b-5472-4e3a-8b88-4ed17ed3c44c\") " pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.627660 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52725a1b-5472-4e3a-8b88-4ed17ed3c44c-apiservice-cert\") pod \"metallb-operator-webhook-server-7f6b8f677-57vgm\" (UID: \"52725a1b-5472-4e3a-8b88-4ed17ed3c44c\") " pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.632464 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52725a1b-5472-4e3a-8b88-4ed17ed3c44c-webhook-cert\") pod \"metallb-operator-webhook-server-7f6b8f677-57vgm\" (UID: \"52725a1b-5472-4e3a-8b88-4ed17ed3c44c\") " pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.632508 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52725a1b-5472-4e3a-8b88-4ed17ed3c44c-apiservice-cert\") pod \"metallb-operator-webhook-server-7f6b8f677-57vgm\" (UID: \"52725a1b-5472-4e3a-8b88-4ed17ed3c44c\") " pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.652395 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vt62\" (UniqueName: 
\"kubernetes.io/projected/52725a1b-5472-4e3a-8b88-4ed17ed3c44c-kube-api-access-2vt62\") pod \"metallb-operator-webhook-server-7f6b8f677-57vgm\" (UID: \"52725a1b-5472-4e3a-8b88-4ed17ed3c44c\") " pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.766172 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv"] Mar 20 16:16:00 crc kubenswrapper[4708]: I0320 16:16:00.775973 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:01 crc kubenswrapper[4708]: W0320 16:16:01.040966 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2014e205_8eb6_4dc9_8fa7_f0935f73019d.slice/crio-4ba5643415a022bc44d8a57c5c798255217d604053495b1cac88a66d835e2997 WatchSource:0}: Error finding container 4ba5643415a022bc44d8a57c5c798255217d604053495b1cac88a66d835e2997: Status 404 returned error can't find the container with id 4ba5643415a022bc44d8a57c5c798255217d604053495b1cac88a66d835e2997 Mar 20 16:16:01 crc kubenswrapper[4708]: I0320 16:16:01.042770 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-rbg8m"] Mar 20 16:16:01 crc kubenswrapper[4708]: I0320 16:16:01.068093 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm"] Mar 20 16:16:01 crc kubenswrapper[4708]: W0320 16:16:01.068655 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52725a1b_5472_4e3a_8b88_4ed17ed3c44c.slice/crio-3060ac385e2b42d034995b8e39f32a67f56dcbe3b82fb607734770d2a8b238ee WatchSource:0}: Error finding container 3060ac385e2b42d034995b8e39f32a67f56dcbe3b82fb607734770d2a8b238ee: Status 404 
returned error can't find the container with id 3060ac385e2b42d034995b8e39f32a67f56dcbe3b82fb607734770d2a8b238ee Mar 20 16:16:01 crc kubenswrapper[4708]: I0320 16:16:01.224530 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-rbg8m" event={"ID":"2014e205-8eb6-4dc9-8fa7-f0935f73019d","Type":"ContainerStarted","Data":"4ba5643415a022bc44d8a57c5c798255217d604053495b1cac88a66d835e2997"} Mar 20 16:16:01 crc kubenswrapper[4708]: I0320 16:16:01.225562 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" event={"ID":"95dd2919-dc89-4679-a48e-873f255af21e","Type":"ContainerStarted","Data":"7e3141ce1cbbff9836698adefdcf00a5a32b3a63a95b0b6ed8418791cd6e4947"} Mar 20 16:16:01 crc kubenswrapper[4708]: I0320 16:16:01.228655 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" event={"ID":"52725a1b-5472-4e3a-8b88-4ed17ed3c44c","Type":"ContainerStarted","Data":"3060ac385e2b42d034995b8e39f32a67f56dcbe3b82fb607734770d2a8b238ee"} Mar 20 16:16:03 crc kubenswrapper[4708]: I0320 16:16:03.250040 4708 generic.go:334] "Generic (PLEG): container finished" podID="2014e205-8eb6-4dc9-8fa7-f0935f73019d" containerID="419a0b72c831d2af0889a6f5356970735fbd438b4513aef69a146e4d0e01dea4" exitCode=0 Mar 20 16:16:03 crc kubenswrapper[4708]: I0320 16:16:03.250194 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-rbg8m" event={"ID":"2014e205-8eb6-4dc9-8fa7-f0935f73019d","Type":"ContainerDied","Data":"419a0b72c831d2af0889a6f5356970735fbd438b4513aef69a146e4d0e01dea4"} Mar 20 16:16:04 crc kubenswrapper[4708]: I0320 16:16:04.532278 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-rbg8m" Mar 20 16:16:04 crc kubenswrapper[4708]: I0320 16:16:04.686873 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwn6w\" (UniqueName: \"kubernetes.io/projected/2014e205-8eb6-4dc9-8fa7-f0935f73019d-kube-api-access-pwn6w\") pod \"2014e205-8eb6-4dc9-8fa7-f0935f73019d\" (UID: \"2014e205-8eb6-4dc9-8fa7-f0935f73019d\") " Mar 20 16:16:04 crc kubenswrapper[4708]: I0320 16:16:04.693621 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2014e205-8eb6-4dc9-8fa7-f0935f73019d-kube-api-access-pwn6w" (OuterVolumeSpecName: "kube-api-access-pwn6w") pod "2014e205-8eb6-4dc9-8fa7-f0935f73019d" (UID: "2014e205-8eb6-4dc9-8fa7-f0935f73019d"). InnerVolumeSpecName "kube-api-access-pwn6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:16:04 crc kubenswrapper[4708]: I0320 16:16:04.788444 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwn6w\" (UniqueName: \"kubernetes.io/projected/2014e205-8eb6-4dc9-8fa7-f0935f73019d-kube-api-access-pwn6w\") on node \"crc\" DevicePath \"\"" Mar 20 16:16:05 crc kubenswrapper[4708]: I0320 16:16:05.265394 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-rbg8m" event={"ID":"2014e205-8eb6-4dc9-8fa7-f0935f73019d","Type":"ContainerDied","Data":"4ba5643415a022bc44d8a57c5c798255217d604053495b1cac88a66d835e2997"} Mar 20 16:16:05 crc kubenswrapper[4708]: I0320 16:16:05.265430 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ba5643415a022bc44d8a57c5c798255217d604053495b1cac88a66d835e2997" Mar 20 16:16:05 crc kubenswrapper[4708]: I0320 16:16:05.265480 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-rbg8m" Mar 20 16:16:05 crc kubenswrapper[4708]: I0320 16:16:05.583116 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-gqgxj"] Mar 20 16:16:05 crc kubenswrapper[4708]: I0320 16:16:05.587832 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-gqgxj"] Mar 20 16:16:06 crc kubenswrapper[4708]: I0320 16:16:06.120893 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ddd964-0a24-464f-82f0-99152d1b7839" path="/var/lib/kubelet/pods/e1ddd964-0a24-464f-82f0-99152d1b7839/volumes" Mar 20 16:16:07 crc kubenswrapper[4708]: I0320 16:16:07.280016 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" event={"ID":"52725a1b-5472-4e3a-8b88-4ed17ed3c44c","Type":"ContainerStarted","Data":"f00827938ae24520299b255fb801f1f6a8480fe77e7b7d8c9b34c8200fae0766"} Mar 20 16:16:07 crc kubenswrapper[4708]: I0320 16:16:07.280379 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:07 crc kubenswrapper[4708]: I0320 16:16:07.281448 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" event={"ID":"95dd2919-dc89-4679-a48e-873f255af21e","Type":"ContainerStarted","Data":"5da3aae707803594286d644bfa533f099676114b93d106d40dcc4cf6eaa73edc"} Mar 20 16:16:07 crc kubenswrapper[4708]: I0320 16:16:07.281579 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:07 crc kubenswrapper[4708]: I0320 16:16:07.295710 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" 
podStartSLOduration=1.276819121 podStartE2EDuration="7.295692249s" podCreationTimestamp="2026-03-20 16:16:00 +0000 UTC" firstStartedPulling="2026-03-20 16:16:01.071481291 +0000 UTC m=+915.745818006" lastFinishedPulling="2026-03-20 16:16:07.090354419 +0000 UTC m=+921.764691134" observedRunningTime="2026-03-20 16:16:07.294599189 +0000 UTC m=+921.968935904" watchObservedRunningTime="2026-03-20 16:16:07.295692249 +0000 UTC m=+921.970028964" Mar 20 16:16:07 crc kubenswrapper[4708]: I0320 16:16:07.317601 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" podStartSLOduration=1.027306847 podStartE2EDuration="7.317579864s" podCreationTimestamp="2026-03-20 16:16:00 +0000 UTC" firstStartedPulling="2026-03-20 16:16:00.777562599 +0000 UTC m=+915.451899304" lastFinishedPulling="2026-03-20 16:16:07.067835606 +0000 UTC m=+921.742172321" observedRunningTime="2026-03-20 16:16:07.316354072 +0000 UTC m=+921.990690787" watchObservedRunningTime="2026-03-20 16:16:07.317579864 +0000 UTC m=+921.991916609" Mar 20 16:16:11 crc kubenswrapper[4708]: I0320 16:16:11.266849 4708 scope.go:117] "RemoveContainer" containerID="f470bd0f3ccfa0a28663b9f7725ba6373d4c96ab2ee74fca2fe3e04bd0b68f80" Mar 20 16:16:20 crc kubenswrapper[4708]: I0320 16:16:20.972406 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7f6b8f677-57vgm" Mar 20 16:16:40 crc kubenswrapper[4708]: I0320 16:16:40.404778 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6687cdd9c4-rqtvv" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.057654 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-82c7x"] Mar 20 16:16:41 crc kubenswrapper[4708]: E0320 16:16:41.058064 4708 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2014e205-8eb6-4dc9-8fa7-f0935f73019d" containerName="oc" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.058085 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="2014e205-8eb6-4dc9-8fa7-f0935f73019d" containerName="oc" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.058217 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="2014e205-8eb6-4dc9-8fa7-f0935f73019d" containerName="oc" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.060776 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.062693 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7dwlh" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.063243 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.063726 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.074785 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m"] Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.075557 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.077573 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.081531 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m"] Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.091124 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49jxb\" (UniqueName: \"kubernetes.io/projected/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-kube-api-access-49jxb\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.091177 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb237cd-490a-4b0b-9e60-e52df43516af-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-b542m\" (UID: \"fbb237cd-490a-4b0b-9e60-e52df43516af\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.091205 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-frr-startup\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.091251 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-metrics\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 
20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.091292 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-metrics-certs\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.091312 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-frr-conf\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.091335 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-reloader\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.091359 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-frr-sockets\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.091381 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trsfr\" (UniqueName: \"kubernetes.io/projected/fbb237cd-490a-4b0b-9e60-e52df43516af-kube-api-access-trsfr\") pod \"frr-k8s-webhook-server-bcc4b6f68-b542m\" (UID: \"fbb237cd-490a-4b0b-9e60-e52df43516af\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 
16:16:41.155973 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wddvr"] Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.156940 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.164853 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.167156 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-mqtxd"] Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.168018 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.171276 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.171590 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.171658 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.171601 4708 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-t6m8k" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.185409 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-mqtxd"] Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192325 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb237cd-490a-4b0b-9e60-e52df43516af-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-b542m\" (UID: \"fbb237cd-490a-4b0b-9e60-e52df43516af\") " 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192385 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-frr-startup\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192422 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192450 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvhzq\" (UniqueName: \"kubernetes.io/projected/4a141870-780d-49d9-b58a-6cbeed6b9000-kube-api-access-mvhzq\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192490 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jltcb\" (UniqueName: \"kubernetes.io/projected/ad48fc57-2a20-4652-bf02-e0250154f5a2-kube-api-access-jltcb\") pod \"controller-7bb4cc7c98-mqtxd\" (UID: \"ad48fc57-2a20-4652-bf02-e0250154f5a2\") " pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192527 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-metrics\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc 
kubenswrapper[4708]: I0320 16:16:41.192569 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad48fc57-2a20-4652-bf02-e0250154f5a2-cert\") pod \"controller-7bb4cc7c98-mqtxd\" (UID: \"ad48fc57-2a20-4652-bf02-e0250154f5a2\") " pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192592 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad48fc57-2a20-4652-bf02-e0250154f5a2-metrics-certs\") pod \"controller-7bb4cc7c98-mqtxd\" (UID: \"ad48fc57-2a20-4652-bf02-e0250154f5a2\") " pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192620 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-metrics-certs\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192645 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-frr-conf\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192697 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-reloader\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192730 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" 
(UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-frr-sockets\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192761 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trsfr\" (UniqueName: \"kubernetes.io/projected/fbb237cd-490a-4b0b-9e60-e52df43516af-kube-api-access-trsfr\") pod \"frr-k8s-webhook-server-bcc4b6f68-b542m\" (UID: \"fbb237cd-490a-4b0b-9e60-e52df43516af\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192788 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49jxb\" (UniqueName: \"kubernetes.io/projected/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-kube-api-access-49jxb\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192815 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4a141870-780d-49d9-b58a-6cbeed6b9000-metallb-excludel2\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.192846 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-metrics-certs\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: E0320 16:16:41.193010 4708 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 20 16:16:41 crc 
kubenswrapper[4708]: E0320 16:16:41.193062 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbb237cd-490a-4b0b-9e60-e52df43516af-cert podName:fbb237cd-490a-4b0b-9e60-e52df43516af nodeName:}" failed. No retries permitted until 2026-03-20 16:16:41.693043489 +0000 UTC m=+956.367380204 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fbb237cd-490a-4b0b-9e60-e52df43516af-cert") pod "frr-k8s-webhook-server-bcc4b6f68-b542m" (UID: "fbb237cd-490a-4b0b-9e60-e52df43516af") : secret "frr-k8s-webhook-server-cert" not found Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.194237 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-frr-startup\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.194515 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-metrics\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.195869 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-reloader\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.195881 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-frr-conf\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" 
Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.195983 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-frr-sockets\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.200775 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-metrics-certs\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.215628 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trsfr\" (UniqueName: \"kubernetes.io/projected/fbb237cd-490a-4b0b-9e60-e52df43516af-kube-api-access-trsfr\") pod \"frr-k8s-webhook-server-bcc4b6f68-b542m\" (UID: \"fbb237cd-490a-4b0b-9e60-e52df43516af\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.225262 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49jxb\" (UniqueName: \"kubernetes.io/projected/03f3b7c8-445f-4ef1-9920-fa75d2fcd0be-kube-api-access-49jxb\") pod \"frr-k8s-82c7x\" (UID: \"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be\") " pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.293840 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad48fc57-2a20-4652-bf02-e0250154f5a2-cert\") pod \"controller-7bb4cc7c98-mqtxd\" (UID: \"ad48fc57-2a20-4652-bf02-e0250154f5a2\") " pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.293893 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad48fc57-2a20-4652-bf02-e0250154f5a2-metrics-certs\") pod \"controller-7bb4cc7c98-mqtxd\" (UID: \"ad48fc57-2a20-4652-bf02-e0250154f5a2\") " pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.293941 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4a141870-780d-49d9-b58a-6cbeed6b9000-metallb-excludel2\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.293970 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-metrics-certs\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.294006 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.294026 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jltcb\" (UniqueName: \"kubernetes.io/projected/ad48fc57-2a20-4652-bf02-e0250154f5a2-kube-api-access-jltcb\") pod \"controller-7bb4cc7c98-mqtxd\" (UID: \"ad48fc57-2a20-4652-bf02-e0250154f5a2\") " pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.294052 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvhzq\" (UniqueName: 
\"kubernetes.io/projected/4a141870-780d-49d9-b58a-6cbeed6b9000-kube-api-access-mvhzq\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: E0320 16:16:41.294110 4708 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 20 16:16:41 crc kubenswrapper[4708]: E0320 16:16:41.294163 4708 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 20 16:16:41 crc kubenswrapper[4708]: E0320 16:16:41.294246 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad48fc57-2a20-4652-bf02-e0250154f5a2-metrics-certs podName:ad48fc57-2a20-4652-bf02-e0250154f5a2 nodeName:}" failed. No retries permitted until 2026-03-20 16:16:41.794193263 +0000 UTC m=+956.468530038 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad48fc57-2a20-4652-bf02-e0250154f5a2-metrics-certs") pod "controller-7bb4cc7c98-mqtxd" (UID: "ad48fc57-2a20-4652-bf02-e0250154f5a2") : secret "controller-certs-secret" not found Mar 20 16:16:41 crc kubenswrapper[4708]: E0320 16:16:41.294290 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-metrics-certs podName:4a141870-780d-49d9-b58a-6cbeed6b9000 nodeName:}" failed. No retries permitted until 2026-03-20 16:16:41.794279575 +0000 UTC m=+956.468616290 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-metrics-certs") pod "speaker-wddvr" (UID: "4a141870-780d-49d9-b58a-6cbeed6b9000") : secret "speaker-certs-secret" not found Mar 20 16:16:41 crc kubenswrapper[4708]: E0320 16:16:41.294452 4708 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 16:16:41 crc kubenswrapper[4708]: E0320 16:16:41.294500 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist podName:4a141870-780d-49d9-b58a-6cbeed6b9000 nodeName:}" failed. No retries permitted until 2026-03-20 16:16:41.79448577 +0000 UTC m=+956.468822485 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist") pod "speaker-wddvr" (UID: "4a141870-780d-49d9-b58a-6cbeed6b9000") : secret "metallb-memberlist" not found Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.294874 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4a141870-780d-49d9-b58a-6cbeed6b9000-metallb-excludel2\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.303188 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad48fc57-2a20-4652-bf02-e0250154f5a2-cert\") pod \"controller-7bb4cc7c98-mqtxd\" (UID: \"ad48fc57-2a20-4652-bf02-e0250154f5a2\") " pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.311605 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jltcb\" (UniqueName: 
\"kubernetes.io/projected/ad48fc57-2a20-4652-bf02-e0250154f5a2-kube-api-access-jltcb\") pod \"controller-7bb4cc7c98-mqtxd\" (UID: \"ad48fc57-2a20-4652-bf02-e0250154f5a2\") " pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.312413 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvhzq\" (UniqueName: \"kubernetes.io/projected/4a141870-780d-49d9-b58a-6cbeed6b9000-kube-api-access-mvhzq\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.378761 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-82c7x" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.698557 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb237cd-490a-4b0b-9e60-e52df43516af-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-b542m\" (UID: \"fbb237cd-490a-4b0b-9e60-e52df43516af\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.701727 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb237cd-490a-4b0b-9e60-e52df43516af-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-b542m\" (UID: \"fbb237cd-490a-4b0b-9e60-e52df43516af\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.800450 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad48fc57-2a20-4652-bf02-e0250154f5a2-metrics-certs\") pod \"controller-7bb4cc7c98-mqtxd\" (UID: \"ad48fc57-2a20-4652-bf02-e0250154f5a2\") " pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.800555 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-metrics-certs\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.800688 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: E0320 16:16:41.800840 4708 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 16:16:41 crc kubenswrapper[4708]: E0320 16:16:41.800922 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist podName:4a141870-780d-49d9-b58a-6cbeed6b9000 nodeName:}" failed. No retries permitted until 2026-03-20 16:16:42.800908167 +0000 UTC m=+957.475244882 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist") pod "speaker-wddvr" (UID: "4a141870-780d-49d9-b58a-6cbeed6b9000") : secret "metallb-memberlist" not found Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.803991 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-metrics-certs\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.804275 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad48fc57-2a20-4652-bf02-e0250154f5a2-metrics-certs\") pod \"controller-7bb4cc7c98-mqtxd\" (UID: \"ad48fc57-2a20-4652-bf02-e0250154f5a2\") " pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:41 crc kubenswrapper[4708]: I0320 16:16:41.994420 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" Mar 20 16:16:42 crc kubenswrapper[4708]: I0320 16:16:42.087244 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:42 crc kubenswrapper[4708]: I0320 16:16:42.500017 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82c7x" event={"ID":"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be","Type":"ContainerStarted","Data":"c16863eb5dcaa6b6e5db749fcc167a5b715b8a9f3dde7ee53cc933e68ed1e0a1"} Mar 20 16:16:42 crc kubenswrapper[4708]: I0320 16:16:42.655336 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m"] Mar 20 16:16:42 crc kubenswrapper[4708]: I0320 16:16:42.699363 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-mqtxd"] Mar 20 16:16:42 crc kubenswrapper[4708]: W0320 16:16:42.701884 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad48fc57_2a20_4652_bf02_e0250154f5a2.slice/crio-c6636a6d16ad0c3b48f2eb914b13769c34ec40adcffc85ea593fdc653013aaaa WatchSource:0}: Error finding container c6636a6d16ad0c3b48f2eb914b13769c34ec40adcffc85ea593fdc653013aaaa: Status 404 returned error can't find the container with id c6636a6d16ad0c3b48f2eb914b13769c34ec40adcffc85ea593fdc653013aaaa Mar 20 16:16:42 crc kubenswrapper[4708]: I0320 16:16:42.833598 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:42 crc kubenswrapper[4708]: E0320 16:16:42.833797 4708 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 16:16:42 crc kubenswrapper[4708]: E0320 16:16:42.834103 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist 
podName:4a141870-780d-49d9-b58a-6cbeed6b9000 nodeName:}" failed. No retries permitted until 2026-03-20 16:16:44.834084892 +0000 UTC m=+959.508421607 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist") pod "speaker-wddvr" (UID: "4a141870-780d-49d9-b58a-6cbeed6b9000") : secret "metallb-memberlist" not found Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.039359 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vzbj6"] Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.040588 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.063351 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vzbj6"] Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.239497 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bl4w\" (UniqueName: \"kubernetes.io/projected/4669de98-fd7a-48da-bb9c-5baf3d20e103-kube-api-access-7bl4w\") pod \"community-operators-vzbj6\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.239573 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-utilities\") pod \"community-operators-vzbj6\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.239613 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-catalog-content\") pod \"community-operators-vzbj6\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.340975 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bl4w\" (UniqueName: \"kubernetes.io/projected/4669de98-fd7a-48da-bb9c-5baf3d20e103-kube-api-access-7bl4w\") pod \"community-operators-vzbj6\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.341252 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-utilities\") pod \"community-operators-vzbj6\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.341392 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-catalog-content\") pod \"community-operators-vzbj6\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.341776 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-utilities\") pod \"community-operators-vzbj6\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.341812 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-catalog-content\") pod \"community-operators-vzbj6\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.367011 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bl4w\" (UniqueName: \"kubernetes.io/projected/4669de98-fd7a-48da-bb9c-5baf3d20e103-kube-api-access-7bl4w\") pod \"community-operators-vzbj6\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.510129 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" event={"ID":"fbb237cd-490a-4b0b-9e60-e52df43516af","Type":"ContainerStarted","Data":"cfb13691b3efb35cecd989877d7b55410991b278cf9c56d8d417d856dccda302"} Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.517322 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-mqtxd" event={"ID":"ad48fc57-2a20-4652-bf02-e0250154f5a2","Type":"ContainerStarted","Data":"b29236fc73e493a248443d9b6c1c82c156fd242be3710f6e0a272541ae3787c5"} Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.517366 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-mqtxd" event={"ID":"ad48fc57-2a20-4652-bf02-e0250154f5a2","Type":"ContainerStarted","Data":"770b5e0aa5aaae0439305062200fd36e1522a5615385165633f58dfa8f1fb952"} Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.517377 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-mqtxd" event={"ID":"ad48fc57-2a20-4652-bf02-e0250154f5a2","Type":"ContainerStarted","Data":"c6636a6d16ad0c3b48f2eb914b13769c34ec40adcffc85ea593fdc653013aaaa"} Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.518331 4708 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.540858 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-mqtxd" podStartSLOduration=2.540843401 podStartE2EDuration="2.540843401s" podCreationTimestamp="2026-03-20 16:16:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:16:43.538398255 +0000 UTC m=+958.212734980" watchObservedRunningTime="2026-03-20 16:16:43.540843401 +0000 UTC m=+958.215180116" Mar 20 16:16:43 crc kubenswrapper[4708]: I0320 16:16:43.658793 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:16:44 crc kubenswrapper[4708]: I0320 16:16:44.063951 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vzbj6"] Mar 20 16:16:44 crc kubenswrapper[4708]: I0320 16:16:44.524866 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzbj6" event={"ID":"4669de98-fd7a-48da-bb9c-5baf3d20e103","Type":"ContainerStarted","Data":"a260020724ab806c3bb4bf04cb55d0cf66687825aa3d0c1906c844a95229a027"} Mar 20 16:16:44 crc kubenswrapper[4708]: I0320 16:16:44.864164 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist\") pod \"speaker-wddvr\" (UID: \"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:44 crc kubenswrapper[4708]: I0320 16:16:44.871920 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4a141870-780d-49d9-b58a-6cbeed6b9000-memberlist\") pod \"speaker-wddvr\" (UID: 
\"4a141870-780d-49d9-b58a-6cbeed6b9000\") " pod="metallb-system/speaker-wddvr" Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.079232 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wddvr" Mar 20 16:16:45 crc kubenswrapper[4708]: W0320 16:16:45.134242 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a141870_780d_49d9_b58a_6cbeed6b9000.slice/crio-e4c506923c51240349cdbce22bd152058ed8e9fe03b7402752c1952ece77ddab WatchSource:0}: Error finding container e4c506923c51240349cdbce22bd152058ed8e9fe03b7402752c1952ece77ddab: Status 404 returned error can't find the container with id e4c506923c51240349cdbce22bd152058ed8e9fe03b7402752c1952ece77ddab Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.541794 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wddvr" event={"ID":"4a141870-780d-49d9-b58a-6cbeed6b9000","Type":"ContainerStarted","Data":"a53f9ea540d9e903c6fe707fde544f681afb79890aedf6249dad1fb0d178df95"} Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.542537 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wddvr" event={"ID":"4a141870-780d-49d9-b58a-6cbeed6b9000","Type":"ContainerStarted","Data":"e4c506923c51240349cdbce22bd152058ed8e9fe03b7402752c1952ece77ddab"} Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.546010 4708 generic.go:334] "Generic (PLEG): container finished" podID="4669de98-fd7a-48da-bb9c-5baf3d20e103" containerID="6183b894080b56c93a61ae1a865dc872711ce3196ca59c66afb6c6901b33c988" exitCode=0 Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.547475 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzbj6" event={"ID":"4669de98-fd7a-48da-bb9c-5baf3d20e103","Type":"ContainerDied","Data":"6183b894080b56c93a61ae1a865dc872711ce3196ca59c66afb6c6901b33c988"} Mar 20 16:16:45 crc 
kubenswrapper[4708]: I0320 16:16:45.777111 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r9qhs"] Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.778484 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.798457 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9qhs"] Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.876301 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zrlk\" (UniqueName: \"kubernetes.io/projected/45595f80-1c03-4668-a1f4-869ff7f5bbea-kube-api-access-9zrlk\") pod \"redhat-marketplace-r9qhs\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.876356 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-utilities\") pod \"redhat-marketplace-r9qhs\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.876413 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-catalog-content\") pod \"redhat-marketplace-r9qhs\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.978027 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zrlk\" (UniqueName: 
\"kubernetes.io/projected/45595f80-1c03-4668-a1f4-869ff7f5bbea-kube-api-access-9zrlk\") pod \"redhat-marketplace-r9qhs\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.978091 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-utilities\") pod \"redhat-marketplace-r9qhs\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.978148 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-catalog-content\") pod \"redhat-marketplace-r9qhs\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.978748 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-catalog-content\") pod \"redhat-marketplace-r9qhs\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:16:45 crc kubenswrapper[4708]: I0320 16:16:45.978908 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-utilities\") pod \"redhat-marketplace-r9qhs\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:16:46 crc kubenswrapper[4708]: I0320 16:16:46.004208 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zrlk\" (UniqueName: 
\"kubernetes.io/projected/45595f80-1c03-4668-a1f4-869ff7f5bbea-kube-api-access-9zrlk\") pod \"redhat-marketplace-r9qhs\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:16:46 crc kubenswrapper[4708]: I0320 16:16:46.120558 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:16:46 crc kubenswrapper[4708]: I0320 16:16:46.567306 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wddvr" event={"ID":"4a141870-780d-49d9-b58a-6cbeed6b9000","Type":"ContainerStarted","Data":"51bb2481e82c4f32431b8063b9c40fd0f4df7d72775845e0a178641ffbb84201"} Mar 20 16:16:46 crc kubenswrapper[4708]: I0320 16:16:46.567961 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wddvr" Mar 20 16:16:46 crc kubenswrapper[4708]: I0320 16:16:46.576787 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzbj6" event={"ID":"4669de98-fd7a-48da-bb9c-5baf3d20e103","Type":"ContainerStarted","Data":"d4df4f6ab127b8084424f430927738cd30b0f5132064e9eb131de0f0094c71b8"} Mar 20 16:16:46 crc kubenswrapper[4708]: I0320 16:16:46.585733 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wddvr" podStartSLOduration=5.58571565 podStartE2EDuration="5.58571565s" podCreationTimestamp="2026-03-20 16:16:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:16:46.583007256 +0000 UTC m=+961.257343981" watchObservedRunningTime="2026-03-20 16:16:46.58571565 +0000 UTC m=+961.260052365" Mar 20 16:16:46 crc kubenswrapper[4708]: I0320 16:16:46.649374 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9qhs"] Mar 20 16:16:47 crc kubenswrapper[4708]: I0320 
16:16:47.584999 4708 generic.go:334] "Generic (PLEG): container finished" podID="4669de98-fd7a-48da-bb9c-5baf3d20e103" containerID="d4df4f6ab127b8084424f430927738cd30b0f5132064e9eb131de0f0094c71b8" exitCode=0 Mar 20 16:16:47 crc kubenswrapper[4708]: I0320 16:16:47.585265 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzbj6" event={"ID":"4669de98-fd7a-48da-bb9c-5baf3d20e103","Type":"ContainerDied","Data":"d4df4f6ab127b8084424f430927738cd30b0f5132064e9eb131de0f0094c71b8"} Mar 20 16:16:47 crc kubenswrapper[4708]: I0320 16:16:47.588174 4708 generic.go:334] "Generic (PLEG): container finished" podID="45595f80-1c03-4668-a1f4-869ff7f5bbea" containerID="41c0141f0659169a510907b5d0cdf35946ea9dc813059f945277235896f3f7b1" exitCode=0 Mar 20 16:16:47 crc kubenswrapper[4708]: I0320 16:16:47.588250 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9qhs" event={"ID":"45595f80-1c03-4668-a1f4-869ff7f5bbea","Type":"ContainerDied","Data":"41c0141f0659169a510907b5d0cdf35946ea9dc813059f945277235896f3f7b1"} Mar 20 16:16:47 crc kubenswrapper[4708]: I0320 16:16:47.588314 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9qhs" event={"ID":"45595f80-1c03-4668-a1f4-869ff7f5bbea","Type":"ContainerStarted","Data":"91bc7668387c9ee52cb11df8089728983e227115777a83e5f9be971ddf2e9a41"} Mar 20 16:16:52 crc kubenswrapper[4708]: I0320 16:16:52.092022 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-mqtxd" Mar 20 16:16:55 crc kubenswrapper[4708]: I0320 16:16:55.083624 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wddvr" Mar 20 16:16:56 crc kubenswrapper[4708]: I0320 16:16:56.178438 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:16:56 crc kubenswrapper[4708]: I0320 16:16:56.178552 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:16:56 crc kubenswrapper[4708]: I0320 16:16:56.646286 4708 generic.go:334] "Generic (PLEG): container finished" podID="45595f80-1c03-4668-a1f4-869ff7f5bbea" containerID="0e2e7884c806a50bf0c5af599e085e057846747495a1ce12b240261d127a15dd" exitCode=0 Mar 20 16:16:56 crc kubenswrapper[4708]: I0320 16:16:56.646340 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9qhs" event={"ID":"45595f80-1c03-4668-a1f4-869ff7f5bbea","Type":"ContainerDied","Data":"0e2e7884c806a50bf0c5af599e085e057846747495a1ce12b240261d127a15dd"} Mar 20 16:16:56 crc kubenswrapper[4708]: I0320 16:16:56.648738 4708 generic.go:334] "Generic (PLEG): container finished" podID="03f3b7c8-445f-4ef1-9920-fa75d2fcd0be" containerID="c90c25e4c943cd5ec669d507f4b11ff3527769783ca91dca1fce710866a7d4d3" exitCode=0 Mar 20 16:16:56 crc kubenswrapper[4708]: I0320 16:16:56.648785 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82c7x" event={"ID":"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be","Type":"ContainerDied","Data":"c90c25e4c943cd5ec669d507f4b11ff3527769783ca91dca1fce710866a7d4d3"} Mar 20 16:16:56 crc kubenswrapper[4708]: I0320 16:16:56.653073 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" 
event={"ID":"fbb237cd-490a-4b0b-9e60-e52df43516af","Type":"ContainerStarted","Data":"7d17093ae2fcd521a6cee0adea8158b484c1a0da758b7a3aae52beaa7835c903"} Mar 20 16:16:56 crc kubenswrapper[4708]: I0320 16:16:56.653212 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" Mar 20 16:16:56 crc kubenswrapper[4708]: I0320 16:16:56.659116 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzbj6" event={"ID":"4669de98-fd7a-48da-bb9c-5baf3d20e103","Type":"ContainerStarted","Data":"a3985c41a49bde3edaccdb819bcf48f1d5ea47f8d88958e3f02a341f779c9f44"} Mar 20 16:16:56 crc kubenswrapper[4708]: I0320 16:16:56.689146 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vzbj6" podStartSLOduration=3.597552985 podStartE2EDuration="13.689128175s" podCreationTimestamp="2026-03-20 16:16:43 +0000 UTC" firstStartedPulling="2026-03-20 16:16:45.548296209 +0000 UTC m=+960.222632924" lastFinishedPulling="2026-03-20 16:16:55.639871399 +0000 UTC m=+970.314208114" observedRunningTime="2026-03-20 16:16:56.687742308 +0000 UTC m=+971.362079033" watchObservedRunningTime="2026-03-20 16:16:56.689128175 +0000 UTC m=+971.363464890" Mar 20 16:16:56 crc kubenswrapper[4708]: I0320 16:16:56.704890 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" podStartSLOduration=2.219909919 podStartE2EDuration="15.704870356s" podCreationTimestamp="2026-03-20 16:16:41 +0000 UTC" firstStartedPulling="2026-03-20 16:16:42.663236611 +0000 UTC m=+957.337573326" lastFinishedPulling="2026-03-20 16:16:56.148197048 +0000 UTC m=+970.822533763" observedRunningTime="2026-03-20 16:16:56.703210192 +0000 UTC m=+971.377546907" watchObservedRunningTime="2026-03-20 16:16:56.704870356 +0000 UTC m=+971.379207071" Mar 20 16:16:57 crc kubenswrapper[4708]: I0320 
16:16:57.667124 4708 generic.go:334] "Generic (PLEG): container finished" podID="03f3b7c8-445f-4ef1-9920-fa75d2fcd0be" containerID="91a7f33f481165eaa71dc9286fd2a24ff90ed421e6e702a9c916d3e1cbd6002c" exitCode=0 Mar 20 16:16:57 crc kubenswrapper[4708]: I0320 16:16:57.667243 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82c7x" event={"ID":"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be","Type":"ContainerDied","Data":"91a7f33f481165eaa71dc9286fd2a24ff90ed421e6e702a9c916d3e1cbd6002c"} Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.542319 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kf56m"] Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.543574 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kf56m" Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.547855 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.547900 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.551453 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kf56m"] Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.602911 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-b696g" Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.649802 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmncf\" (UniqueName: \"kubernetes.io/projected/24f329d2-189c-4e9f-84f0-0425f8d68510-kube-api-access-nmncf\") pod \"openstack-operator-index-kf56m\" (UID: \"24f329d2-189c-4e9f-84f0-0425f8d68510\") " 
pod="openstack-operators/openstack-operator-index-kf56m" Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.676546 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9qhs" event={"ID":"45595f80-1c03-4668-a1f4-869ff7f5bbea","Type":"ContainerStarted","Data":"fb7ffcefa4b194620669bc9230869fe6380bd5b3ca53a6177b7844fdb5d9b92b"} Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.679462 4708 generic.go:334] "Generic (PLEG): container finished" podID="03f3b7c8-445f-4ef1-9920-fa75d2fcd0be" containerID="fc4e09d19d9d339806e434c3b12ffcb2a8e04af3b68ea6516a2254a256d98768" exitCode=0 Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.679503 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82c7x" event={"ID":"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be","Type":"ContainerDied","Data":"fc4e09d19d9d339806e434c3b12ffcb2a8e04af3b68ea6516a2254a256d98768"} Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.704656 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r9qhs" podStartSLOduration=3.08999958 podStartE2EDuration="13.704636054s" podCreationTimestamp="2026-03-20 16:16:45 +0000 UTC" firstStartedPulling="2026-03-20 16:16:47.589429224 +0000 UTC m=+962.263765939" lastFinishedPulling="2026-03-20 16:16:58.204065698 +0000 UTC m=+972.878402413" observedRunningTime="2026-03-20 16:16:58.70111895 +0000 UTC m=+973.375455665" watchObservedRunningTime="2026-03-20 16:16:58.704636054 +0000 UTC m=+973.378972769" Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.752544 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmncf\" (UniqueName: \"kubernetes.io/projected/24f329d2-189c-4e9f-84f0-0425f8d68510-kube-api-access-nmncf\") pod \"openstack-operator-index-kf56m\" (UID: \"24f329d2-189c-4e9f-84f0-0425f8d68510\") " pod="openstack-operators/openstack-operator-index-kf56m" Mar 20 16:16:58 
crc kubenswrapper[4708]: I0320 16:16:58.769443 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmncf\" (UniqueName: \"kubernetes.io/projected/24f329d2-189c-4e9f-84f0-0425f8d68510-kube-api-access-nmncf\") pod \"openstack-operator-index-kf56m\" (UID: \"24f329d2-189c-4e9f-84f0-0425f8d68510\") " pod="openstack-operators/openstack-operator-index-kf56m" Mar 20 16:16:58 crc kubenswrapper[4708]: I0320 16:16:58.915707 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kf56m" Mar 20 16:16:59 crc kubenswrapper[4708]: I0320 16:16:59.211634 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kf56m"] Mar 20 16:16:59 crc kubenswrapper[4708]: W0320 16:16:59.216779 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24f329d2_189c_4e9f_84f0_0425f8d68510.slice/crio-1a1c6ebf0af23c7e6a34c219fb537c6a1faad442372555ec1d6a1c23d906dee9 WatchSource:0}: Error finding container 1a1c6ebf0af23c7e6a34c219fb537c6a1faad442372555ec1d6a1c23d906dee9: Status 404 returned error can't find the container with id 1a1c6ebf0af23c7e6a34c219fb537c6a1faad442372555ec1d6a1c23d906dee9 Mar 20 16:16:59 crc kubenswrapper[4708]: I0320 16:16:59.689046 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82c7x" event={"ID":"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be","Type":"ContainerStarted","Data":"788467ff8e526a042938ebb25f373832baa0c84387f735b4fff2479608b1dd75"} Mar 20 16:16:59 crc kubenswrapper[4708]: I0320 16:16:59.689138 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82c7x" event={"ID":"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be","Type":"ContainerStarted","Data":"28717102c3a412ff0fed3bb90e73609c2f0ead3b60246c2fc3a81755b519a2ad"} Mar 20 16:16:59 crc kubenswrapper[4708]: I0320 16:16:59.689148 4708 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82c7x" event={"ID":"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be","Type":"ContainerStarted","Data":"7f0d7522c0e1789a2128242d08ec56ddc99bd2a22ed68a496d5309f32b0eba67"} Mar 20 16:16:59 crc kubenswrapper[4708]: I0320 16:16:59.689157 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82c7x" event={"ID":"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be","Type":"ContainerStarted","Data":"99258815b4df39e349a57238ecef1bcec60b41b0f49992d94c92ad67f2e83de5"} Mar 20 16:16:59 crc kubenswrapper[4708]: I0320 16:16:59.690024 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kf56m" event={"ID":"24f329d2-189c-4e9f-84f0-0425f8d68510","Type":"ContainerStarted","Data":"1a1c6ebf0af23c7e6a34c219fb537c6a1faad442372555ec1d6a1c23d906dee9"} Mar 20 16:17:00 crc kubenswrapper[4708]: I0320 16:17:00.698336 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82c7x" event={"ID":"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be","Type":"ContainerStarted","Data":"3e019633a47a37fea4d2805791fb03d5882edce4fb21813c38bc5a1a2dc5076a"} Mar 20 16:17:01 crc kubenswrapper[4708]: I0320 16:17:01.710733 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-82c7x" event={"ID":"03f3b7c8-445f-4ef1-9920-fa75d2fcd0be","Type":"ContainerStarted","Data":"be06cd75a58d1630d321b9e13906abf007404459d638fa925b13d6a5b4d6c92d"} Mar 20 16:17:01 crc kubenswrapper[4708]: I0320 16:17:01.711084 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-82c7x" Mar 20 16:17:01 crc kubenswrapper[4708]: I0320 16:17:01.735527 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-82c7x" podStartSLOduration=6.503166257 podStartE2EDuration="20.735506545s" podCreationTimestamp="2026-03-20 16:16:41 +0000 UTC" firstStartedPulling="2026-03-20 16:16:41.744409318 +0000 UTC m=+956.418746033" 
lastFinishedPulling="2026-03-20 16:16:55.976749606 +0000 UTC m=+970.651086321" observedRunningTime="2026-03-20 16:17:01.731600592 +0000 UTC m=+976.405937307" watchObservedRunningTime="2026-03-20 16:17:01.735506545 +0000 UTC m=+976.409843260" Mar 20 16:17:03 crc kubenswrapper[4708]: I0320 16:17:03.659798 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:17:03 crc kubenswrapper[4708]: I0320 16:17:03.659917 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:17:03 crc kubenswrapper[4708]: I0320 16:17:03.709888 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:17:03 crc kubenswrapper[4708]: I0320 16:17:03.771786 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:17:03 crc kubenswrapper[4708]: I0320 16:17:03.877276 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kf56m"] Mar 20 16:17:04 crc kubenswrapper[4708]: I0320 16:17:04.686217 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-m8lg6"] Mar 20 16:17:04 crc kubenswrapper[4708]: I0320 16:17:04.688035 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-m8lg6" Mar 20 16:17:04 crc kubenswrapper[4708]: I0320 16:17:04.695355 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m8lg6"] Mar 20 16:17:04 crc kubenswrapper[4708]: I0320 16:17:04.744649 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp9nd\" (UniqueName: \"kubernetes.io/projected/be2b9d3f-c01f-4649-9b51-2068e6f541a8-kube-api-access-qp9nd\") pod \"openstack-operator-index-m8lg6\" (UID: \"be2b9d3f-c01f-4649-9b51-2068e6f541a8\") " pod="openstack-operators/openstack-operator-index-m8lg6" Mar 20 16:17:04 crc kubenswrapper[4708]: I0320 16:17:04.845769 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp9nd\" (UniqueName: \"kubernetes.io/projected/be2b9d3f-c01f-4649-9b51-2068e6f541a8-kube-api-access-qp9nd\") pod \"openstack-operator-index-m8lg6\" (UID: \"be2b9d3f-c01f-4649-9b51-2068e6f541a8\") " pod="openstack-operators/openstack-operator-index-m8lg6" Mar 20 16:17:04 crc kubenswrapper[4708]: I0320 16:17:04.877318 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp9nd\" (UniqueName: \"kubernetes.io/projected/be2b9d3f-c01f-4649-9b51-2068e6f541a8-kube-api-access-qp9nd\") pod \"openstack-operator-index-m8lg6\" (UID: \"be2b9d3f-c01f-4649-9b51-2068e6f541a8\") " pod="openstack-operators/openstack-operator-index-m8lg6" Mar 20 16:17:05 crc kubenswrapper[4708]: I0320 16:17:05.018282 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-m8lg6" Mar 20 16:17:06 crc kubenswrapper[4708]: I0320 16:17:06.121903 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:17:06 crc kubenswrapper[4708]: I0320 16:17:06.122224 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:17:06 crc kubenswrapper[4708]: I0320 16:17:06.167062 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:17:06 crc kubenswrapper[4708]: I0320 16:17:06.380058 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-82c7x" Mar 20 16:17:06 crc kubenswrapper[4708]: I0320 16:17:06.415878 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-82c7x" Mar 20 16:17:06 crc kubenswrapper[4708]: I0320 16:17:06.779507 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:17:06 crc kubenswrapper[4708]: I0320 16:17:06.885865 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6flzf"] Mar 20 16:17:06 crc kubenswrapper[4708]: I0320 16:17:06.887325 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:06 crc kubenswrapper[4708]: I0320 16:17:06.903537 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6flzf"] Mar 20 16:17:07 crc kubenswrapper[4708]: I0320 16:17:07.076062 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-catalog-content\") pod \"certified-operators-6flzf\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:07 crc kubenswrapper[4708]: I0320 16:17:07.076124 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbtt6\" (UniqueName: \"kubernetes.io/projected/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-kube-api-access-lbtt6\") pod \"certified-operators-6flzf\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:07 crc kubenswrapper[4708]: I0320 16:17:07.076312 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-utilities\") pod \"certified-operators-6flzf\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:07 crc kubenswrapper[4708]: I0320 16:17:07.177629 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-utilities\") pod \"certified-operators-6flzf\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:07 crc kubenswrapper[4708]: I0320 16:17:07.177745 4708 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-catalog-content\") pod \"certified-operators-6flzf\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:07 crc kubenswrapper[4708]: I0320 16:17:07.177775 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbtt6\" (UniqueName: \"kubernetes.io/projected/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-kube-api-access-lbtt6\") pod \"certified-operators-6flzf\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:07 crc kubenswrapper[4708]: I0320 16:17:07.178205 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-utilities\") pod \"certified-operators-6flzf\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:07 crc kubenswrapper[4708]: I0320 16:17:07.178247 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-catalog-content\") pod \"certified-operators-6flzf\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:07 crc kubenswrapper[4708]: I0320 16:17:07.200946 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbtt6\" (UniqueName: \"kubernetes.io/projected/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-kube-api-access-lbtt6\") pod \"certified-operators-6flzf\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:07 crc kubenswrapper[4708]: I0320 16:17:07.204129 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:08 crc kubenswrapper[4708]: I0320 16:17:08.281246 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vzbj6"] Mar 20 16:17:08 crc kubenswrapper[4708]: I0320 16:17:08.281788 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vzbj6" podUID="4669de98-fd7a-48da-bb9c-5baf3d20e103" containerName="registry-server" containerID="cri-o://a3985c41a49bde3edaccdb819bcf48f1d5ea47f8d88958e3f02a341f779c9f44" gracePeriod=2 Mar 20 16:17:08 crc kubenswrapper[4708]: I0320 16:17:08.771884 4708 generic.go:334] "Generic (PLEG): container finished" podID="4669de98-fd7a-48da-bb9c-5baf3d20e103" containerID="a3985c41a49bde3edaccdb819bcf48f1d5ea47f8d88958e3f02a341f779c9f44" exitCode=0 Mar 20 16:17:08 crc kubenswrapper[4708]: I0320 16:17:08.771989 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzbj6" event={"ID":"4669de98-fd7a-48da-bb9c-5baf3d20e103","Type":"ContainerDied","Data":"a3985c41a49bde3edaccdb819bcf48f1d5ea47f8d88958e3f02a341f779c9f44"} Mar 20 16:17:08 crc kubenswrapper[4708]: I0320 16:17:08.824779 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m8lg6"] Mar 20 16:17:09 crc kubenswrapper[4708]: I0320 16:17:09.779179 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m8lg6" event={"ID":"be2b9d3f-c01f-4649-9b51-2068e6f541a8","Type":"ContainerStarted","Data":"ceef22b63c84a9c97ec7438fc3fc9a9b5babfd33498de37b832ebc0e399aa4e0"} Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.090643 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.224142 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bl4w\" (UniqueName: \"kubernetes.io/projected/4669de98-fd7a-48da-bb9c-5baf3d20e103-kube-api-access-7bl4w\") pod \"4669de98-fd7a-48da-bb9c-5baf3d20e103\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.224192 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-catalog-content\") pod \"4669de98-fd7a-48da-bb9c-5baf3d20e103\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.224416 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-utilities\") pod \"4669de98-fd7a-48da-bb9c-5baf3d20e103\" (UID: \"4669de98-fd7a-48da-bb9c-5baf3d20e103\") " Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.225260 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-utilities" (OuterVolumeSpecName: "utilities") pod "4669de98-fd7a-48da-bb9c-5baf3d20e103" (UID: "4669de98-fd7a-48da-bb9c-5baf3d20e103"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.229971 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4669de98-fd7a-48da-bb9c-5baf3d20e103-kube-api-access-7bl4w" (OuterVolumeSpecName: "kube-api-access-7bl4w") pod "4669de98-fd7a-48da-bb9c-5baf3d20e103" (UID: "4669de98-fd7a-48da-bb9c-5baf3d20e103"). InnerVolumeSpecName "kube-api-access-7bl4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.271105 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4669de98-fd7a-48da-bb9c-5baf3d20e103" (UID: "4669de98-fd7a-48da-bb9c-5baf3d20e103"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.325737 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.325802 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bl4w\" (UniqueName: \"kubernetes.io/projected/4669de98-fd7a-48da-bb9c-5baf3d20e103-kube-api-access-7bl4w\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.325819 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4669de98-fd7a-48da-bb9c-5baf3d20e103-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.437577 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6flzf"] Mar 20 16:17:10 crc kubenswrapper[4708]: W0320 16:17:10.530996 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb1ae22a_7bc4_48c5_bb1e_f4ef766b51db.slice/crio-6f4142e08563a3a04651501d832b31b4a045ad3a3b2d8cdaab66009476fd2d4d WatchSource:0}: Error finding container 6f4142e08563a3a04651501d832b31b4a045ad3a3b2d8cdaab66009476fd2d4d: Status 404 returned error can't find the container with id 
6f4142e08563a3a04651501d832b31b4a045ad3a3b2d8cdaab66009476fd2d4d Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.787135 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6flzf" event={"ID":"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db","Type":"ContainerStarted","Data":"6f4142e08563a3a04651501d832b31b4a045ad3a3b2d8cdaab66009476fd2d4d"} Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.789575 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzbj6" event={"ID":"4669de98-fd7a-48da-bb9c-5baf3d20e103","Type":"ContainerDied","Data":"a260020724ab806c3bb4bf04cb55d0cf66687825aa3d0c1906c844a95229a027"} Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.789629 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vzbj6" Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.789639 4708 scope.go:117] "RemoveContainer" containerID="a3985c41a49bde3edaccdb819bcf48f1d5ea47f8d88958e3f02a341f779c9f44" Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.825750 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vzbj6"] Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.826145 4708 scope.go:117] "RemoveContainer" containerID="d4df4f6ab127b8084424f430927738cd30b0f5132064e9eb131de0f0094c71b8" Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.832373 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vzbj6"] Mar 20 16:17:10 crc kubenswrapper[4708]: I0320 16:17:10.869651 4708 scope.go:117] "RemoveContainer" containerID="6183b894080b56c93a61ae1a865dc872711ce3196ca59c66afb6c6901b33c988" Mar 20 16:17:11 crc kubenswrapper[4708]: I0320 16:17:11.384514 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-82c7x" Mar 20 16:17:11 crc 
kubenswrapper[4708]: I0320 16:17:11.796330 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kf56m" event={"ID":"24f329d2-189c-4e9f-84f0-0425f8d68510","Type":"ContainerStarted","Data":"0f0d7f7544405e2d65581acc6924ab09f0fbb2273febe0a13e64ef0dfcc8cb73"} Mar 20 16:17:11 crc kubenswrapper[4708]: I0320 16:17:11.796398 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-kf56m" podUID="24f329d2-189c-4e9f-84f0-0425f8d68510" containerName="registry-server" containerID="cri-o://0f0d7f7544405e2d65581acc6924ab09f0fbb2273febe0a13e64ef0dfcc8cb73" gracePeriod=2 Mar 20 16:17:11 crc kubenswrapper[4708]: I0320 16:17:11.798345 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m8lg6" event={"ID":"be2b9d3f-c01f-4649-9b51-2068e6f541a8","Type":"ContainerStarted","Data":"215891de72ada9ebe3f532af4859a5129c0a5842c4e8186b17382bc250b8fa71"} Mar 20 16:17:11 crc kubenswrapper[4708]: I0320 16:17:11.801762 4708 generic.go:334] "Generic (PLEG): container finished" podID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerID="b83db0228b231c16db027a6b5f3ee57e6ba7ea69059a767b796f041b25057843" exitCode=0 Mar 20 16:17:11 crc kubenswrapper[4708]: I0320 16:17:11.801814 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6flzf" event={"ID":"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db","Type":"ContainerDied","Data":"b83db0228b231c16db027a6b5f3ee57e6ba7ea69059a767b796f041b25057843"} Mar 20 16:17:11 crc kubenswrapper[4708]: I0320 16:17:11.818974 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kf56m" podStartSLOduration=2.211027318 podStartE2EDuration="13.818952309s" podCreationTimestamp="2026-03-20 16:16:58 +0000 UTC" firstStartedPulling="2026-03-20 16:16:59.218601999 +0000 UTC m=+973.892938714" 
lastFinishedPulling="2026-03-20 16:17:10.82652699 +0000 UTC m=+985.500863705" observedRunningTime="2026-03-20 16:17:11.814464959 +0000 UTC m=+986.488801684" watchObservedRunningTime="2026-03-20 16:17:11.818952309 +0000 UTC m=+986.493289024" Mar 20 16:17:11 crc kubenswrapper[4708]: I0320 16:17:11.831123 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-m8lg6" podStartSLOduration=6.150143537 podStartE2EDuration="7.831098535s" podCreationTimestamp="2026-03-20 16:17:04 +0000 UTC" firstStartedPulling="2026-03-20 16:17:09.148649394 +0000 UTC m=+983.822986109" lastFinishedPulling="2026-03-20 16:17:10.829604392 +0000 UTC m=+985.503941107" observedRunningTime="2026-03-20 16:17:11.827602181 +0000 UTC m=+986.501938906" watchObservedRunningTime="2026-03-20 16:17:11.831098535 +0000 UTC m=+986.505435250" Mar 20 16:17:11 crc kubenswrapper[4708]: I0320 16:17:11.877226 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9qhs"] Mar 20 16:17:11 crc kubenswrapper[4708]: I0320 16:17:11.877450 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r9qhs" podUID="45595f80-1c03-4668-a1f4-869ff7f5bbea" containerName="registry-server" containerID="cri-o://fb7ffcefa4b194620669bc9230869fe6380bd5b3ca53a6177b7844fdb5d9b92b" gracePeriod=2 Mar 20 16:17:11 crc kubenswrapper[4708]: I0320 16:17:11.999804 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-b542m" Mar 20 16:17:12 crc kubenswrapper[4708]: I0320 16:17:12.251586 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4669de98-fd7a-48da-bb9c-5baf3d20e103" path="/var/lib/kubelet/pods/4669de98-fd7a-48da-bb9c-5baf3d20e103/volumes" Mar 20 16:17:12 crc kubenswrapper[4708]: I0320 16:17:12.812281 4708 generic.go:334] "Generic (PLEG): container finished" 
podID="24f329d2-189c-4e9f-84f0-0425f8d68510" containerID="0f0d7f7544405e2d65581acc6924ab09f0fbb2273febe0a13e64ef0dfcc8cb73" exitCode=0 Mar 20 16:17:12 crc kubenswrapper[4708]: I0320 16:17:12.812388 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kf56m" event={"ID":"24f329d2-189c-4e9f-84f0-0425f8d68510","Type":"ContainerDied","Data":"0f0d7f7544405e2d65581acc6924ab09f0fbb2273febe0a13e64ef0dfcc8cb73"} Mar 20 16:17:12 crc kubenswrapper[4708]: I0320 16:17:12.816167 4708 generic.go:334] "Generic (PLEG): container finished" podID="45595f80-1c03-4668-a1f4-869ff7f5bbea" containerID="fb7ffcefa4b194620669bc9230869fe6380bd5b3ca53a6177b7844fdb5d9b92b" exitCode=0 Mar 20 16:17:12 crc kubenswrapper[4708]: I0320 16:17:12.816207 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9qhs" event={"ID":"45595f80-1c03-4668-a1f4-869ff7f5bbea","Type":"ContainerDied","Data":"fb7ffcefa4b194620669bc9230869fe6380bd5b3ca53a6177b7844fdb5d9b92b"} Mar 20 16:17:13 crc kubenswrapper[4708]: I0320 16:17:13.827837 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9qhs" event={"ID":"45595f80-1c03-4668-a1f4-869ff7f5bbea","Type":"ContainerDied","Data":"91bc7668387c9ee52cb11df8089728983e227115777a83e5f9be971ddf2e9a41"} Mar 20 16:17:13 crc kubenswrapper[4708]: I0320 16:17:13.827877 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91bc7668387c9ee52cb11df8089728983e227115777a83e5f9be971ddf2e9a41" Mar 20 16:17:13 crc kubenswrapper[4708]: I0320 16:17:13.880699 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:17:13 crc kubenswrapper[4708]: I0320 16:17:13.983198 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zrlk\" (UniqueName: \"kubernetes.io/projected/45595f80-1c03-4668-a1f4-869ff7f5bbea-kube-api-access-9zrlk\") pod \"45595f80-1c03-4668-a1f4-869ff7f5bbea\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " Mar 20 16:17:13 crc kubenswrapper[4708]: I0320 16:17:13.983292 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-catalog-content\") pod \"45595f80-1c03-4668-a1f4-869ff7f5bbea\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " Mar 20 16:17:13 crc kubenswrapper[4708]: I0320 16:17:13.983336 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-utilities\") pod \"45595f80-1c03-4668-a1f4-869ff7f5bbea\" (UID: \"45595f80-1c03-4668-a1f4-869ff7f5bbea\") " Mar 20 16:17:13 crc kubenswrapper[4708]: I0320 16:17:13.984685 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-utilities" (OuterVolumeSpecName: "utilities") pod "45595f80-1c03-4668-a1f4-869ff7f5bbea" (UID: "45595f80-1c03-4668-a1f4-869ff7f5bbea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:17:13 crc kubenswrapper[4708]: I0320 16:17:13.993019 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45595f80-1c03-4668-a1f4-869ff7f5bbea-kube-api-access-9zrlk" (OuterVolumeSpecName: "kube-api-access-9zrlk") pod "45595f80-1c03-4668-a1f4-869ff7f5bbea" (UID: "45595f80-1c03-4668-a1f4-869ff7f5bbea"). InnerVolumeSpecName "kube-api-access-9zrlk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.015695 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45595f80-1c03-4668-a1f4-869ff7f5bbea" (UID: "45595f80-1c03-4668-a1f4-869ff7f5bbea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.059234 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kf56m" Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.086895 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zrlk\" (UniqueName: \"kubernetes.io/projected/45595f80-1c03-4668-a1f4-869ff7f5bbea-kube-api-access-9zrlk\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.087045 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.087059 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45595f80-1c03-4668-a1f4-869ff7f5bbea-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.188656 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmncf\" (UniqueName: \"kubernetes.io/projected/24f329d2-189c-4e9f-84f0-0425f8d68510-kube-api-access-nmncf\") pod \"24f329d2-189c-4e9f-84f0-0425f8d68510\" (UID: \"24f329d2-189c-4e9f-84f0-0425f8d68510\") " Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.195660 4708 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/24f329d2-189c-4e9f-84f0-0425f8d68510-kube-api-access-nmncf" (OuterVolumeSpecName: "kube-api-access-nmncf") pod "24f329d2-189c-4e9f-84f0-0425f8d68510" (UID: "24f329d2-189c-4e9f-84f0-0425f8d68510"). InnerVolumeSpecName "kube-api-access-nmncf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.290014 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmncf\" (UniqueName: \"kubernetes.io/projected/24f329d2-189c-4e9f-84f0-0425f8d68510-kube-api-access-nmncf\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.837628 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kf56m" event={"ID":"24f329d2-189c-4e9f-84f0-0425f8d68510","Type":"ContainerDied","Data":"1a1c6ebf0af23c7e6a34c219fb537c6a1faad442372555ec1d6a1c23d906dee9"} Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.837715 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kf56m" Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.838106 4708 scope.go:117] "RemoveContainer" containerID="0f0d7f7544405e2d65581acc6924ab09f0fbb2273febe0a13e64ef0dfcc8cb73" Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.839443 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6flzf" event={"ID":"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db","Type":"ContainerStarted","Data":"0f41af8946c4445d172f115b93405ac3d703525a5a71e389685e8596b4dee793"} Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.839476 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9qhs" Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.928374 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9qhs"] Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.941511 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9qhs"] Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.947561 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kf56m"] Mar 20 16:17:14 crc kubenswrapper[4708]: I0320 16:17:14.957561 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-kf56m"] Mar 20 16:17:15 crc kubenswrapper[4708]: I0320 16:17:15.018587 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-m8lg6" Mar 20 16:17:15 crc kubenswrapper[4708]: I0320 16:17:15.018636 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-m8lg6" Mar 20 16:17:15 crc kubenswrapper[4708]: I0320 16:17:15.045982 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-m8lg6" Mar 20 16:17:15 crc kubenswrapper[4708]: I0320 16:17:15.847827 4708 generic.go:334] "Generic (PLEG): container finished" podID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerID="0f41af8946c4445d172f115b93405ac3d703525a5a71e389685e8596b4dee793" exitCode=0 Mar 20 16:17:15 crc kubenswrapper[4708]: I0320 16:17:15.847904 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6flzf" event={"ID":"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db","Type":"ContainerDied","Data":"0f41af8946c4445d172f115b93405ac3d703525a5a71e389685e8596b4dee793"} Mar 20 16:17:15 crc kubenswrapper[4708]: I0320 16:17:15.880172 4708 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-m8lg6" Mar 20 16:17:16 crc kubenswrapper[4708]: I0320 16:17:16.120821 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f329d2-189c-4e9f-84f0-0425f8d68510" path="/var/lib/kubelet/pods/24f329d2-189c-4e9f-84f0-0425f8d68510/volumes" Mar 20 16:17:16 crc kubenswrapper[4708]: I0320 16:17:16.121691 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45595f80-1c03-4668-a1f4-869ff7f5bbea" path="/var/lib/kubelet/pods/45595f80-1c03-4668-a1f4-869ff7f5bbea/volumes" Mar 20 16:17:16 crc kubenswrapper[4708]: I0320 16:17:16.858704 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6flzf" event={"ID":"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db","Type":"ContainerStarted","Data":"3bbce2ccb995982f2e5a74961e4ce0805c89f16dc1065d4426a558b11cf61bb3"} Mar 20 16:17:16 crc kubenswrapper[4708]: I0320 16:17:16.881137 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6flzf" podStartSLOduration=6.300240138 podStartE2EDuration="10.881117913s" podCreationTimestamp="2026-03-20 16:17:06 +0000 UTC" firstStartedPulling="2026-03-20 16:17:11.803725691 +0000 UTC m=+986.478062406" lastFinishedPulling="2026-03-20 16:17:16.384603466 +0000 UTC m=+991.058940181" observedRunningTime="2026-03-20 16:17:16.877319332 +0000 UTC m=+991.551656057" watchObservedRunningTime="2026-03-20 16:17:16.881117913 +0000 UTC m=+991.555454628" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.205383 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.205997 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:17 crc 
kubenswrapper[4708]: I0320 16:17:17.929631 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn"] Mar 20 16:17:17 crc kubenswrapper[4708]: E0320 16:17:17.931166 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45595f80-1c03-4668-a1f4-869ff7f5bbea" containerName="registry-server" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.931265 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="45595f80-1c03-4668-a1f4-869ff7f5bbea" containerName="registry-server" Mar 20 16:17:17 crc kubenswrapper[4708]: E0320 16:17:17.931333 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45595f80-1c03-4668-a1f4-869ff7f5bbea" containerName="extract-content" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.931392 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="45595f80-1c03-4668-a1f4-869ff7f5bbea" containerName="extract-content" Mar 20 16:17:17 crc kubenswrapper[4708]: E0320 16:17:17.931461 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f329d2-189c-4e9f-84f0-0425f8d68510" containerName="registry-server" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.931537 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f329d2-189c-4e9f-84f0-0425f8d68510" containerName="registry-server" Mar 20 16:17:17 crc kubenswrapper[4708]: E0320 16:17:17.931599 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4669de98-fd7a-48da-bb9c-5baf3d20e103" containerName="registry-server" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.931681 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="4669de98-fd7a-48da-bb9c-5baf3d20e103" containerName="registry-server" Mar 20 16:17:17 crc kubenswrapper[4708]: E0320 16:17:17.931790 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4669de98-fd7a-48da-bb9c-5baf3d20e103" containerName="extract-utilities" Mar 20 16:17:17 
crc kubenswrapper[4708]: I0320 16:17:17.931859 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="4669de98-fd7a-48da-bb9c-5baf3d20e103" containerName="extract-utilities" Mar 20 16:17:17 crc kubenswrapper[4708]: E0320 16:17:17.931918 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45595f80-1c03-4668-a1f4-869ff7f5bbea" containerName="extract-utilities" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.931978 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="45595f80-1c03-4668-a1f4-869ff7f5bbea" containerName="extract-utilities" Mar 20 16:17:17 crc kubenswrapper[4708]: E0320 16:17:17.932076 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4669de98-fd7a-48da-bb9c-5baf3d20e103" containerName="extract-content" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.932131 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="4669de98-fd7a-48da-bb9c-5baf3d20e103" containerName="extract-content" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.932414 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f329d2-189c-4e9f-84f0-0425f8d68510" containerName="registry-server" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.932479 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="4669de98-fd7a-48da-bb9c-5baf3d20e103" containerName="registry-server" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.932535 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="45595f80-1c03-4668-a1f4-869ff7f5bbea" containerName="registry-server" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.934143 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.936489 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9txbp" Mar 20 16:17:17 crc kubenswrapper[4708]: I0320 16:17:17.937444 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn"] Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.042179 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b67c9\" (UniqueName: \"kubernetes.io/projected/aeecd507-ad55-4ceb-994a-1431f6d686c6-kube-api-access-b67c9\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.042242 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-bundle\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.042268 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-util\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 
16:17:18.144386 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b67c9\" (UniqueName: \"kubernetes.io/projected/aeecd507-ad55-4ceb-994a-1431f6d686c6-kube-api-access-b67c9\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.144458 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-bundle\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.144484 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-util\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.144962 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-bundle\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.144995 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-util\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.165591 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b67c9\" (UniqueName: \"kubernetes.io/projected/aeecd507-ad55-4ceb-994a-1431f6d686c6-kube-api-access-b67c9\") pod \"d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.258213 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6flzf" podUID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerName="registry-server" probeResult="failure" output=< Mar 20 16:17:18 crc kubenswrapper[4708]: timeout: failed to connect service ":50051" within 1s Mar 20 16:17:18 crc kubenswrapper[4708]: > Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.272782 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.686492 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn"] Mar 20 16:17:18 crc kubenswrapper[4708]: I0320 16:17:18.872540 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" event={"ID":"aeecd507-ad55-4ceb-994a-1431f6d686c6","Type":"ContainerStarted","Data":"fc6dfba61b47a1f54138c3016ead3506eadd53fb4042d01f84ad59156c28d1a5"} Mar 20 16:17:19 crc kubenswrapper[4708]: I0320 16:17:19.880236 4708 generic.go:334] "Generic (PLEG): container finished" podID="aeecd507-ad55-4ceb-994a-1431f6d686c6" containerID="aa23f5abdf95738306de5d21273c21fb765c070c2d1fe0a4a0f1fa86ed79c17b" exitCode=0 Mar 20 16:17:19 crc kubenswrapper[4708]: I0320 16:17:19.880289 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" event={"ID":"aeecd507-ad55-4ceb-994a-1431f6d686c6","Type":"ContainerDied","Data":"aa23f5abdf95738306de5d21273c21fb765c070c2d1fe0a4a0f1fa86ed79c17b"} Mar 20 16:17:23 crc kubenswrapper[4708]: I0320 16:17:23.907840 4708 generic.go:334] "Generic (PLEG): container finished" podID="aeecd507-ad55-4ceb-994a-1431f6d686c6" containerID="89a687e7ba3211dae769a66aeabf3e0381687c586bfe93aff8b167ce351816d7" exitCode=0 Mar 20 16:17:23 crc kubenswrapper[4708]: I0320 16:17:23.907935 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" event={"ID":"aeecd507-ad55-4ceb-994a-1431f6d686c6","Type":"ContainerDied","Data":"89a687e7ba3211dae769a66aeabf3e0381687c586bfe93aff8b167ce351816d7"} Mar 20 16:17:24 crc kubenswrapper[4708]: I0320 16:17:24.918874 4708 generic.go:334] 
"Generic (PLEG): container finished" podID="aeecd507-ad55-4ceb-994a-1431f6d686c6" containerID="a54be629b36872996d3758e8e862c7c1c88ea963e0059b4c168c8508f1fc5c66" exitCode=0 Mar 20 16:17:24 crc kubenswrapper[4708]: I0320 16:17:24.918919 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" event={"ID":"aeecd507-ad55-4ceb-994a-1431f6d686c6","Type":"ContainerDied","Data":"a54be629b36872996d3758e8e862c7c1c88ea963e0059b4c168c8508f1fc5c66"} Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.165927 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.178651 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.178742 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.267908 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b67c9\" (UniqueName: \"kubernetes.io/projected/aeecd507-ad55-4ceb-994a-1431f6d686c6-kube-api-access-b67c9\") pod \"aeecd507-ad55-4ceb-994a-1431f6d686c6\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.267971 4708 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-util\") pod \"aeecd507-ad55-4ceb-994a-1431f6d686c6\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.268119 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-bundle\") pod \"aeecd507-ad55-4ceb-994a-1431f6d686c6\" (UID: \"aeecd507-ad55-4ceb-994a-1431f6d686c6\") " Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.269052 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-bundle" (OuterVolumeSpecName: "bundle") pod "aeecd507-ad55-4ceb-994a-1431f6d686c6" (UID: "aeecd507-ad55-4ceb-994a-1431f6d686c6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.273695 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeecd507-ad55-4ceb-994a-1431f6d686c6-kube-api-access-b67c9" (OuterVolumeSpecName: "kube-api-access-b67c9") pod "aeecd507-ad55-4ceb-994a-1431f6d686c6" (UID: "aeecd507-ad55-4ceb-994a-1431f6d686c6"). InnerVolumeSpecName "kube-api-access-b67c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.278531 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-util" (OuterVolumeSpecName: "util") pod "aeecd507-ad55-4ceb-994a-1431f6d686c6" (UID: "aeecd507-ad55-4ceb-994a-1431f6d686c6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.369238 4708 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.369270 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b67c9\" (UniqueName: \"kubernetes.io/projected/aeecd507-ad55-4ceb-994a-1431f6d686c6-kube-api-access-b67c9\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.369280 4708 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aeecd507-ad55-4ceb-994a-1431f6d686c6-util\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.934312 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" event={"ID":"aeecd507-ad55-4ceb-994a-1431f6d686c6","Type":"ContainerDied","Data":"fc6dfba61b47a1f54138c3016ead3506eadd53fb4042d01f84ad59156c28d1a5"} Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.934628 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc6dfba61b47a1f54138c3016ead3506eadd53fb4042d01f84ad59156c28d1a5" Mar 20 16:17:26 crc kubenswrapper[4708]: I0320 16:17:26.934427 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn" Mar 20 16:17:27 crc kubenswrapper[4708]: I0320 16:17:27.265548 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:27 crc kubenswrapper[4708]: I0320 16:17:27.312956 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.676623 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6flzf"] Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.677382 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6flzf" podUID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerName="registry-server" containerID="cri-o://3bbce2ccb995982f2e5a74961e4ce0805c89f16dc1065d4426a558b11cf61bb3" gracePeriod=2 Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.728540 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-94465cd74-j22xf"] Mar 20 16:17:29 crc kubenswrapper[4708]: E0320 16:17:29.728800 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeecd507-ad55-4ceb-994a-1431f6d686c6" containerName="extract" Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.728815 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeecd507-ad55-4ceb-994a-1431f6d686c6" containerName="extract" Mar 20 16:17:29 crc kubenswrapper[4708]: E0320 16:17:29.728841 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeecd507-ad55-4ceb-994a-1431f6d686c6" containerName="pull" Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.728848 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeecd507-ad55-4ceb-994a-1431f6d686c6" containerName="pull" Mar 20 
16:17:29 crc kubenswrapper[4708]: E0320 16:17:29.728862 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeecd507-ad55-4ceb-994a-1431f6d686c6" containerName="util" Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.728868 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeecd507-ad55-4ceb-994a-1431f6d686c6" containerName="util" Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.729002 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeecd507-ad55-4ceb-994a-1431f6d686c6" containerName="extract" Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.729530 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-94465cd74-j22xf" Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.733869 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-868xt" Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.747631 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-94465cd74-j22xf"] Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.929029 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqbb9\" (UniqueName: \"kubernetes.io/projected/5c3f787c-d4c6-49c8-99a2-c9f3ae02738a-kube-api-access-fqbb9\") pod \"openstack-operator-controller-init-94465cd74-j22xf\" (UID: \"5c3f787c-d4c6-49c8-99a2-c9f3ae02738a\") " pod="openstack-operators/openstack-operator-controller-init-94465cd74-j22xf" Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.961082 4708 generic.go:334] "Generic (PLEG): container finished" podID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerID="3bbce2ccb995982f2e5a74961e4ce0805c89f16dc1065d4426a558b11cf61bb3" exitCode=0 Mar 20 16:17:29 crc kubenswrapper[4708]: I0320 16:17:29.961149 4708 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6flzf" event={"ID":"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db","Type":"ContainerDied","Data":"3bbce2ccb995982f2e5a74961e4ce0805c89f16dc1065d4426a558b11cf61bb3"} Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.030460 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqbb9\" (UniqueName: \"kubernetes.io/projected/5c3f787c-d4c6-49c8-99a2-c9f3ae02738a-kube-api-access-fqbb9\") pod \"openstack-operator-controller-init-94465cd74-j22xf\" (UID: \"5c3f787c-d4c6-49c8-99a2-c9f3ae02738a\") " pod="openstack-operators/openstack-operator-controller-init-94465cd74-j22xf" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.051329 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqbb9\" (UniqueName: \"kubernetes.io/projected/5c3f787c-d4c6-49c8-99a2-c9f3ae02738a-kube-api-access-fqbb9\") pod \"openstack-operator-controller-init-94465cd74-j22xf\" (UID: \"5c3f787c-d4c6-49c8-99a2-c9f3ae02738a\") " pod="openstack-operators/openstack-operator-controller-init-94465cd74-j22xf" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.350483 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-94465cd74-j22xf" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.551310 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.609421 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-94465cd74-j22xf"] Mar 20 16:17:30 crc kubenswrapper[4708]: W0320 16:17:30.615025 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c3f787c_d4c6_49c8_99a2_c9f3ae02738a.slice/crio-bf2d277a00e6a10c0a32a3e52666784173a926133204f3aa87353fd9ca3b68d7 WatchSource:0}: Error finding container bf2d277a00e6a10c0a32a3e52666784173a926133204f3aa87353fd9ca3b68d7: Status 404 returned error can't find the container with id bf2d277a00e6a10c0a32a3e52666784173a926133204f3aa87353fd9ca3b68d7 Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.745921 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbtt6\" (UniqueName: \"kubernetes.io/projected/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-kube-api-access-lbtt6\") pod \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.746019 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-utilities\") pod \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.746050 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-catalog-content\") pod \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\" (UID: \"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db\") " Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.746935 4708 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-utilities" (OuterVolumeSpecName: "utilities") pod "cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" (UID: "cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.750879 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-kube-api-access-lbtt6" (OuterVolumeSpecName: "kube-api-access-lbtt6") pod "cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" (UID: "cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db"). InnerVolumeSpecName "kube-api-access-lbtt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.794877 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" (UID: "cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.847854 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbtt6\" (UniqueName: \"kubernetes.io/projected/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-kube-api-access-lbtt6\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.847891 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.847900 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.970528 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6flzf" event={"ID":"cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db","Type":"ContainerDied","Data":"6f4142e08563a3a04651501d832b31b4a045ad3a3b2d8cdaab66009476fd2d4d"} Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.970568 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6flzf" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.970904 4708 scope.go:117] "RemoveContainer" containerID="3bbce2ccb995982f2e5a74961e4ce0805c89f16dc1065d4426a558b11cf61bb3" Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.971770 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-94465cd74-j22xf" event={"ID":"5c3f787c-d4c6-49c8-99a2-c9f3ae02738a","Type":"ContainerStarted","Data":"bf2d277a00e6a10c0a32a3e52666784173a926133204f3aa87353fd9ca3b68d7"} Mar 20 16:17:30 crc kubenswrapper[4708]: I0320 16:17:30.989060 4708 scope.go:117] "RemoveContainer" containerID="0f41af8946c4445d172f115b93405ac3d703525a5a71e389685e8596b4dee793" Mar 20 16:17:31 crc kubenswrapper[4708]: I0320 16:17:31.002370 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6flzf"] Mar 20 16:17:31 crc kubenswrapper[4708]: I0320 16:17:31.016004 4708 scope.go:117] "RemoveContainer" containerID="b83db0228b231c16db027a6b5f3ee57e6ba7ea69059a767b796f041b25057843" Mar 20 16:17:31 crc kubenswrapper[4708]: I0320 16:17:31.019621 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6flzf"] Mar 20 16:17:32 crc kubenswrapper[4708]: I0320 16:17:32.123144 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" path="/var/lib/kubelet/pods/cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db/volumes" Mar 20 16:17:39 crc kubenswrapper[4708]: I0320 16:17:39.024780 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-94465cd74-j22xf" event={"ID":"5c3f787c-d4c6-49c8-99a2-c9f3ae02738a","Type":"ContainerStarted","Data":"a1074616d8b91929bd880c3baa2edd057d6bc171b7815e7bb66deb9df2f649d7"} Mar 20 16:17:39 crc kubenswrapper[4708]: I0320 16:17:39.025486 4708 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-94465cd74-j22xf" Mar 20 16:17:39 crc kubenswrapper[4708]: I0320 16:17:39.055030 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-94465cd74-j22xf" podStartSLOduration=2.43949785 podStartE2EDuration="10.055008398s" podCreationTimestamp="2026-03-20 16:17:29 +0000 UTC" firstStartedPulling="2026-03-20 16:17:30.620109696 +0000 UTC m=+1005.294446411" lastFinishedPulling="2026-03-20 16:17:38.235620244 +0000 UTC m=+1012.909956959" observedRunningTime="2026-03-20 16:17:39.050684672 +0000 UTC m=+1013.725021407" watchObservedRunningTime="2026-03-20 16:17:39.055008398 +0000 UTC m=+1013.729345113" Mar 20 16:17:50 crc kubenswrapper[4708]: I0320 16:17:50.354219 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-94465cd74-j22xf" Mar 20 16:17:56 crc kubenswrapper[4708]: I0320 16:17:56.178650 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:17:56 crc kubenswrapper[4708]: I0320 16:17:56.179100 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:17:56 crc kubenswrapper[4708]: I0320 16:17:56.179247 4708 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:17:56 crc kubenswrapper[4708]: I0320 
16:17:56.180019 4708 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b96515fa4390968ae59bbefc52b318ed0531df03cbc4138681a6e2acfc16a612"} pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:17:56 crc kubenswrapper[4708]: I0320 16:17:56.180086 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" containerID="cri-o://b96515fa4390968ae59bbefc52b318ed0531df03cbc4138681a6e2acfc16a612" gracePeriod=600 Mar 20 16:17:57 crc kubenswrapper[4708]: I0320 16:17:57.159057 4708 generic.go:334] "Generic (PLEG): container finished" podID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerID="b96515fa4390968ae59bbefc52b318ed0531df03cbc4138681a6e2acfc16a612" exitCode=0 Mar 20 16:17:57 crc kubenswrapper[4708]: I0320 16:17:57.159636 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerDied","Data":"b96515fa4390968ae59bbefc52b318ed0531df03cbc4138681a6e2acfc16a612"} Mar 20 16:17:57 crc kubenswrapper[4708]: I0320 16:17:57.159686 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerStarted","Data":"a0d736f2e0fcb77d6a0bd7e7c40db6605c442763cf48cb3f0e13e50d606ae696"} Mar 20 16:17:57 crc kubenswrapper[4708]: I0320 16:17:57.159706 4708 scope.go:117] "RemoveContainer" containerID="d5c914e606937f3e664e2e028c8e0d408cfdd520b7dcf21b2c2c87ba6f561a9d" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.129607 4708 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567058-s2vqj"] Mar 20 16:18:00 crc kubenswrapper[4708]: E0320 16:18:00.130134 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerName="extract-content" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.130148 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerName="extract-content" Mar 20 16:18:00 crc kubenswrapper[4708]: E0320 16:18:00.130160 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerName="registry-server" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.130166 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerName="registry-server" Mar 20 16:18:00 crc kubenswrapper[4708]: E0320 16:18:00.130182 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerName="extract-utilities" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.130187 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerName="extract-utilities" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.130291 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb1ae22a-7bc4-48c5-bb1e-f4ef766b51db" containerName="registry-server" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.130722 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-s2vqj" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.132811 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.132815 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.133177 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.138040 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-s2vqj"] Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.155204 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4bx7\" (UniqueName: \"kubernetes.io/projected/ffb520a6-2a16-4d11-a67c-86ac48535b9a-kube-api-access-w4bx7\") pod \"auto-csr-approver-29567058-s2vqj\" (UID: \"ffb520a6-2a16-4d11-a67c-86ac48535b9a\") " pod="openshift-infra/auto-csr-approver-29567058-s2vqj" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.256050 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4bx7\" (UniqueName: \"kubernetes.io/projected/ffb520a6-2a16-4d11-a67c-86ac48535b9a-kube-api-access-w4bx7\") pod \"auto-csr-approver-29567058-s2vqj\" (UID: \"ffb520a6-2a16-4d11-a67c-86ac48535b9a\") " pod="openshift-infra/auto-csr-approver-29567058-s2vqj" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.274074 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4bx7\" (UniqueName: \"kubernetes.io/projected/ffb520a6-2a16-4d11-a67c-86ac48535b9a-kube-api-access-w4bx7\") pod \"auto-csr-approver-29567058-s2vqj\" (UID: \"ffb520a6-2a16-4d11-a67c-86ac48535b9a\") " 
pod="openshift-infra/auto-csr-approver-29567058-s2vqj" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.447658 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-s2vqj" Mar 20 16:18:00 crc kubenswrapper[4708]: I0320 16:18:00.848112 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-s2vqj"] Mar 20 16:18:00 crc kubenswrapper[4708]: W0320 16:18:00.857893 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffb520a6_2a16_4d11_a67c_86ac48535b9a.slice/crio-07ca6a1a6dbb9e92b3f9a7cac5656616830ff29d63c71a0776b8d8a3a59b7898 WatchSource:0}: Error finding container 07ca6a1a6dbb9e92b3f9a7cac5656616830ff29d63c71a0776b8d8a3a59b7898: Status 404 returned error can't find the container with id 07ca6a1a6dbb9e92b3f9a7cac5656616830ff29d63c71a0776b8d8a3a59b7898 Mar 20 16:18:01 crc kubenswrapper[4708]: I0320 16:18:01.184004 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567058-s2vqj" event={"ID":"ffb520a6-2a16-4d11-a67c-86ac48535b9a","Type":"ContainerStarted","Data":"07ca6a1a6dbb9e92b3f9a7cac5656616830ff29d63c71a0776b8d8a3a59b7898"} Mar 20 16:18:03 crc kubenswrapper[4708]: I0320 16:18:03.203956 4708 generic.go:334] "Generic (PLEG): container finished" podID="ffb520a6-2a16-4d11-a67c-86ac48535b9a" containerID="d92bd51b48fc4d2e903cf8ff8703ab1c2c6b5997c2e523091504d71742c0e445" exitCode=0 Mar 20 16:18:03 crc kubenswrapper[4708]: I0320 16:18:03.204060 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567058-s2vqj" event={"ID":"ffb520a6-2a16-4d11-a67c-86ac48535b9a","Type":"ContainerDied","Data":"d92bd51b48fc4d2e903cf8ff8703ab1c2c6b5997c2e523091504d71742c0e445"} Mar 20 16:18:04 crc kubenswrapper[4708]: I0320 16:18:04.488842 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-s2vqj" Mar 20 16:18:04 crc kubenswrapper[4708]: I0320 16:18:04.610431 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4bx7\" (UniqueName: \"kubernetes.io/projected/ffb520a6-2a16-4d11-a67c-86ac48535b9a-kube-api-access-w4bx7\") pod \"ffb520a6-2a16-4d11-a67c-86ac48535b9a\" (UID: \"ffb520a6-2a16-4d11-a67c-86ac48535b9a\") " Mar 20 16:18:04 crc kubenswrapper[4708]: I0320 16:18:04.634371 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffb520a6-2a16-4d11-a67c-86ac48535b9a-kube-api-access-w4bx7" (OuterVolumeSpecName: "kube-api-access-w4bx7") pod "ffb520a6-2a16-4d11-a67c-86ac48535b9a" (UID: "ffb520a6-2a16-4d11-a67c-86ac48535b9a"). InnerVolumeSpecName "kube-api-access-w4bx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:18:04 crc kubenswrapper[4708]: I0320 16:18:04.712493 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4bx7\" (UniqueName: \"kubernetes.io/projected/ffb520a6-2a16-4d11-a67c-86ac48535b9a-kube-api-access-w4bx7\") on node \"crc\" DevicePath \"\"" Mar 20 16:18:05 crc kubenswrapper[4708]: I0320 16:18:05.217289 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567058-s2vqj" event={"ID":"ffb520a6-2a16-4d11-a67c-86ac48535b9a","Type":"ContainerDied","Data":"07ca6a1a6dbb9e92b3f9a7cac5656616830ff29d63c71a0776b8d8a3a59b7898"} Mar 20 16:18:05 crc kubenswrapper[4708]: I0320 16:18:05.217333 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07ca6a1a6dbb9e92b3f9a7cac5656616830ff29d63c71a0776b8d8a3a59b7898" Mar 20 16:18:05 crc kubenswrapper[4708]: I0320 16:18:05.217482 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-s2vqj" Mar 20 16:18:05 crc kubenswrapper[4708]: I0320 16:18:05.540338 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-hg5th"] Mar 20 16:18:05 crc kubenswrapper[4708]: I0320 16:18:05.544455 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-hg5th"] Mar 20 16:18:06 crc kubenswrapper[4708]: I0320 16:18:06.120738 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba33465-4320-42d8-9ac4-eff3f7d5bca8" path="/var/lib/kubelet/pods/6ba33465-4320-42d8-9ac4-eff3f7d5bca8/volumes" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.188370 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h"] Mar 20 16:18:08 crc kubenswrapper[4708]: E0320 16:18:08.189704 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffb520a6-2a16-4d11-a67c-86ac48535b9a" containerName="oc" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.189798 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb520a6-2a16-4d11-a67c-86ac48535b9a" containerName="oc" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.189971 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffb520a6-2a16-4d11-a67c-86ac48535b9a" containerName="oc" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.190453 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.193970 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5cc72" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.212594 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.214283 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.215037 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.240557 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qrjbp" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.264348 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.266713 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bxrz\" (UniqueName: \"kubernetes.io/projected/c39a1357-1bbd-4bad-b18d-db9e7eacad56-kube-api-access-2bxrz\") pod \"barbican-operator-controller-manager-59bc569d95-xr46h\" (UID: \"c39a1357-1bbd-4bad-b18d-db9e7eacad56\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.266781 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xt28\" 
(UniqueName: \"kubernetes.io/projected/d19b5a0c-7421-4454-9c5a-d2bf4828901a-kube-api-access-5xt28\") pod \"cinder-operator-controller-manager-8d58dc466-sl77r\" (UID: \"d19b5a0c-7421-4454-9c5a-d2bf4828901a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.284042 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.284899 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.290543 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-mfk59" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.302206 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.303003 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" Mar 20 16:18:08 crc kubenswrapper[4708]: W0320 16:18:08.317077 4708 reflector.go:561] object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-c5gz8": failed to list *v1.Secret: secrets "glance-operator-controller-manager-dockercfg-c5gz8" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Mar 20 16:18:08 crc kubenswrapper[4708]: E0320 16:18:08.317128 4708 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"glance-operator-controller-manager-dockercfg-c5gz8\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"glance-operator-controller-manager-dockercfg-c5gz8\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.336801 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.390000 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.390478 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bxrz\" (UniqueName: \"kubernetes.io/projected/c39a1357-1bbd-4bad-b18d-db9e7eacad56-kube-api-access-2bxrz\") pod \"barbican-operator-controller-manager-59bc569d95-xr46h\" (UID: \"c39a1357-1bbd-4bad-b18d-db9e7eacad56\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.390827 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdxgf\" (UniqueName: \"kubernetes.io/projected/19a022dc-93c1-4899-837f-4fd78793d13d-kube-api-access-jdxgf\") pod \"designate-operator-controller-manager-588d4d986b-fznb9\" (UID: \"19a022dc-93c1-4899-837f-4fd78793d13d\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.390870 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xt28\" (UniqueName: \"kubernetes.io/projected/d19b5a0c-7421-4454-9c5a-d2bf4828901a-kube-api-access-5xt28\") pod \"cinder-operator-controller-manager-8d58dc466-sl77r\" (UID: \"d19b5a0c-7421-4454-9c5a-d2bf4828901a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.390929 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72qtd\" (UniqueName: \"kubernetes.io/projected/0a721b9a-b944-45de-8eed-3f181e09f6cf-kube-api-access-72qtd\") pod \"glance-operator-controller-manager-79df6bcc97-nlvdk\" (UID: \"0a721b9a-b944-45de-8eed-3f181e09f6cf\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.400079 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.401129 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.424536 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wntlv" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.424998 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xt28\" (UniqueName: \"kubernetes.io/projected/d19b5a0c-7421-4454-9c5a-d2bf4828901a-kube-api-access-5xt28\") pod \"cinder-operator-controller-manager-8d58dc466-sl77r\" (UID: \"d19b5a0c-7421-4454-9c5a-d2bf4828901a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.426829 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bxrz\" (UniqueName: \"kubernetes.io/projected/c39a1357-1bbd-4bad-b18d-db9e7eacad56-kube-api-access-2bxrz\") pod \"barbican-operator-controller-manager-59bc569d95-xr46h\" (UID: \"c39a1357-1bbd-4bad-b18d-db9e7eacad56\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.433392 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.434323 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.441513 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.451323 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-6625x" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.476712 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.489765 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.491331 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.491418 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm874\" (UniqueName: \"kubernetes.io/projected/04a6f0a4-ca27-4a23-986b-db16468db950-kube-api-access-fm874\") pod \"heat-operator-controller-manager-67dd5f86f5-ttl8m\" (UID: \"04a6f0a4-ca27-4a23-986b-db16468db950\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.491477 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72qtd\" (UniqueName: \"kubernetes.io/projected/0a721b9a-b944-45de-8eed-3f181e09f6cf-kube-api-access-72qtd\") pod \"glance-operator-controller-manager-79df6bcc97-nlvdk\" (UID: \"0a721b9a-b944-45de-8eed-3f181e09f6cf\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.491547 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m2wf\" (UniqueName: \"kubernetes.io/projected/2247a6d3-610d-4ca0-bfe6-711bb14a22cf-kube-api-access-6m2wf\") pod \"horizon-operator-controller-manager-8464cc45fb-zxbx5\" (UID: \"2247a6d3-610d-4ca0-bfe6-711bb14a22cf\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.491567 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdxgf\" (UniqueName: \"kubernetes.io/projected/19a022dc-93c1-4899-837f-4fd78793d13d-kube-api-access-jdxgf\") pod \"designate-operator-controller-manager-588d4d986b-fznb9\" (UID: \"19a022dc-93c1-4899-837f-4fd78793d13d\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" Mar 20 16:18:08 
crc kubenswrapper[4708]: I0320 16:18:08.498305 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7g9hr" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.498590 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.499635 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.511734 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.512581 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.512719 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.512960 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-l4gwq" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.513370 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.526793 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-84cxw" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.532075 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.546544 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.552286 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72qtd\" (UniqueName: \"kubernetes.io/projected/0a721b9a-b944-45de-8eed-3f181e09f6cf-kube-api-access-72qtd\") pod \"glance-operator-controller-manager-79df6bcc97-nlvdk\" (UID: \"0a721b9a-b944-45de-8eed-3f181e09f6cf\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.557210 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdxgf\" (UniqueName: \"kubernetes.io/projected/19a022dc-93c1-4899-837f-4fd78793d13d-kube-api-access-jdxgf\") pod \"designate-operator-controller-manager-588d4d986b-fznb9\" (UID: \"19a022dc-93c1-4899-837f-4fd78793d13d\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.557775 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-52ngd"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.558573 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-52ngd" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.561239 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-mzbkx" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.595372 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm874\" (UniqueName: \"kubernetes.io/projected/04a6f0a4-ca27-4a23-986b-db16468db950-kube-api-access-fm874\") pod \"heat-operator-controller-manager-67dd5f86f5-ttl8m\" (UID: \"04a6f0a4-ca27-4a23-986b-db16468db950\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.595469 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m2wf\" (UniqueName: \"kubernetes.io/projected/2247a6d3-610d-4ca0-bfe6-711bb14a22cf-kube-api-access-6m2wf\") pod \"horizon-operator-controller-manager-8464cc45fb-zxbx5\" (UID: \"2247a6d3-610d-4ca0-bfe6-711bb14a22cf\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.597849 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.619141 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.640611 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.642770 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m2wf\" (UniqueName: \"kubernetes.io/projected/2247a6d3-610d-4ca0-bfe6-711bb14a22cf-kube-api-access-6m2wf\") pod \"horizon-operator-controller-manager-8464cc45fb-zxbx5\" (UID: \"2247a6d3-610d-4ca0-bfe6-711bb14a22cf\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.656879 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm874\" (UniqueName: \"kubernetes.io/projected/04a6f0a4-ca27-4a23-986b-db16468db950-kube-api-access-fm874\") pod \"heat-operator-controller-manager-67dd5f86f5-ttl8m\" (UID: \"04a6f0a4-ca27-4a23-986b-db16468db950\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.696943 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8628t\" (UniqueName: \"kubernetes.io/projected/01ac874a-addd-4513-94c5-cd4b71fa6eb5-kube-api-access-8628t\") pod \"ironic-operator-controller-manager-6f787dddc9-kckv6\" (UID: \"01ac874a-addd-4513-94c5-cd4b71fa6eb5\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.696985 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v9z7\" (UniqueName: \"kubernetes.io/projected/efd0041f-ae6d-4c76-a7b5-092d353a029e-kube-api-access-7v9z7\") pod 
\"manila-operator-controller-manager-55f864c847-52ngd\" (UID: \"efd0041f-ae6d-4c76-a7b5-092d353a029e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-52ngd" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.697023 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert\") pod \"infra-operator-controller-manager-8d4c8954d-tqkbf\" (UID: \"355388b4-930a-4dfa-bb0e-ba8b71e9e40d\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.697060 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmbz8\" (UniqueName: \"kubernetes.io/projected/5b270fa1-96a9-4edd-b984-34900c29cc1b-kube-api-access-tmbz8\") pod \"keystone-operator-controller-manager-768b96df4c-rxb8r\" (UID: \"5b270fa1-96a9-4edd-b984-34900c29cc1b\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.697109 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spggs\" (UniqueName: \"kubernetes.io/projected/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-kube-api-access-spggs\") pod \"infra-operator-controller-manager-8d4c8954d-tqkbf\" (UID: \"355388b4-930a-4dfa-bb0e-ba8b71e9e40d\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.697596 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-52ngd"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.727811 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22"] Mar 20 16:18:08 crc 
kubenswrapper[4708]: I0320 16:18:08.728569 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.728709 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.731564 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.731739 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-k4zgc" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.739454 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-nthm4"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.742062 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-nthm4" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.766016 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.766882 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.766987 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-fwkrc" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.775930 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-s2q7q" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.789047 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.790218 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.797494 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-jbw9l" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.799467 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert\") pod \"infra-operator-controller-manager-8d4c8954d-tqkbf\" (UID: \"355388b4-930a-4dfa-bb0e-ba8b71e9e40d\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.799526 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrgnk\" (UniqueName: \"kubernetes.io/projected/09c50326-bc3c-423d-a25a-39dc67a90efd-kube-api-access-mrgnk\") pod \"octavia-operator-controller-manager-5b9f45d989-55b97\" (UID: \"09c50326-bc3c-423d-a25a-39dc67a90efd\") " 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.799583 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmbz8\" (UniqueName: \"kubernetes.io/projected/5b270fa1-96a9-4edd-b984-34900c29cc1b-kube-api-access-tmbz8\") pod \"keystone-operator-controller-manager-768b96df4c-rxb8r\" (UID: \"5b270fa1-96a9-4edd-b984-34900c29cc1b\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.799601 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l2wp\" (UniqueName: \"kubernetes.io/projected/592c8a29-a52a-4b53-86b0-318dbce9424d-kube-api-access-6l2wp\") pod \"nova-operator-controller-manager-5d488d59fb-z8nqm\" (UID: \"592c8a29-a52a-4b53-86b0-318dbce9424d\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.799617 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7nmj\" (UniqueName: \"kubernetes.io/projected/fcf8691e-a944-41ca-8ee3-23280572780e-kube-api-access-g7nmj\") pod \"neutron-operator-controller-manager-767865f676-nthm4\" (UID: \"fcf8691e-a944-41ca-8ee3-23280572780e\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-nthm4" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.799662 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spggs\" (UniqueName: \"kubernetes.io/projected/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-kube-api-access-spggs\") pod \"infra-operator-controller-manager-8d4c8954d-tqkbf\" (UID: \"355388b4-930a-4dfa-bb0e-ba8b71e9e40d\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 
16:18:08.799708 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8628t\" (UniqueName: \"kubernetes.io/projected/01ac874a-addd-4513-94c5-cd4b71fa6eb5-kube-api-access-8628t\") pod \"ironic-operator-controller-manager-6f787dddc9-kckv6\" (UID: \"01ac874a-addd-4513-94c5-cd4b71fa6eb5\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.799726 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v9z7\" (UniqueName: \"kubernetes.io/projected/efd0041f-ae6d-4c76-a7b5-092d353a029e-kube-api-access-7v9z7\") pod \"manila-operator-controller-manager-55f864c847-52ngd\" (UID: \"efd0041f-ae6d-4c76-a7b5-092d353a029e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-52ngd" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.799747 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnmzr\" (UniqueName: \"kubernetes.io/projected/1f45c62f-cffa-416f-bead-eace9256fa45-kube-api-access-gnmzr\") pod \"mariadb-operator-controller-manager-67ccfc9778-5vc22\" (UID: \"1f45c62f-cffa-416f-bead-eace9256fa45\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22" Mar 20 16:18:08 crc kubenswrapper[4708]: E0320 16:18:08.800008 4708 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 16:18:08 crc kubenswrapper[4708]: E0320 16:18:08.800071 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert podName:355388b4-930a-4dfa-bb0e-ba8b71e9e40d nodeName:}" failed. No retries permitted until 2026-03-20 16:18:09.300052882 +0000 UTC m=+1043.974389597 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert") pod "infra-operator-controller-manager-8d4c8954d-tqkbf" (UID: "355388b4-930a-4dfa-bb0e-ba8b71e9e40d") : secret "infra-operator-webhook-server-cert" not found Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.806254 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.824845 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-hz97n"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.826298 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.831462 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-tdpgv" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.834260 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.835343 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.835702 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spggs\" (UniqueName: \"kubernetes.io/projected/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-kube-api-access-spggs\") pod \"infra-operator-controller-manager-8d4c8954d-tqkbf\" (UID: \"355388b4-930a-4dfa-bb0e-ba8b71e9e40d\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.838168 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmbz8\" (UniqueName: \"kubernetes.io/projected/5b270fa1-96a9-4edd-b984-34900c29cc1b-kube-api-access-tmbz8\") pod \"keystone-operator-controller-manager-768b96df4c-rxb8r\" (UID: \"5b270fa1-96a9-4edd-b984-34900c29cc1b\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.838789 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-sk7fv" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.838914 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.839858 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.840227 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.869788 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8628t\" (UniqueName: \"kubernetes.io/projected/01ac874a-addd-4513-94c5-cd4b71fa6eb5-kube-api-access-8628t\") pod \"ironic-operator-controller-manager-6f787dddc9-kckv6\" (UID: \"01ac874a-addd-4513-94c5-cd4b71fa6eb5\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.870148 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v9z7\" (UniqueName: \"kubernetes.io/projected/efd0041f-ae6d-4c76-a7b5-092d353a029e-kube-api-access-7v9z7\") pod \"manila-operator-controller-manager-55f864c847-52ngd\" (UID: \"efd0041f-ae6d-4c76-a7b5-092d353a029e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-52ngd" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.853193 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.879237 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fkr9p" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.900595 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnmzr\" (UniqueName: \"kubernetes.io/projected/1f45c62f-cffa-416f-bead-eace9256fa45-kube-api-access-gnmzr\") pod \"mariadb-operator-controller-manager-67ccfc9778-5vc22\" (UID: \"1f45c62f-cffa-416f-bead-eace9256fa45\") " 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.912618 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-nthm4"] Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.912759 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.913593 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.915363 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrgnk\" (UniqueName: \"kubernetes.io/projected/09c50326-bc3c-423d-a25a-39dc67a90efd-kube-api-access-mrgnk\") pod \"octavia-operator-controller-manager-5b9f45d989-55b97\" (UID: \"09c50326-bc3c-423d-a25a-39dc67a90efd\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.915419 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l2wp\" (UniqueName: \"kubernetes.io/projected/592c8a29-a52a-4b53-86b0-318dbce9424d-kube-api-access-6l2wp\") pod \"nova-operator-controller-manager-5d488d59fb-z8nqm\" (UID: \"592c8a29-a52a-4b53-86b0-318dbce9424d\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.915452 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7nmj\" (UniqueName: \"kubernetes.io/projected/fcf8691e-a944-41ca-8ee3-23280572780e-kube-api-access-g7nmj\") pod \"neutron-operator-controller-manager-767865f676-nthm4\" (UID: 
\"fcf8691e-a944-41ca-8ee3-23280572780e\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-nthm4" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.916626 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-mxn4s" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.992504 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7nmj\" (UniqueName: \"kubernetes.io/projected/fcf8691e-a944-41ca-8ee3-23280572780e-kube-api-access-g7nmj\") pod \"neutron-operator-controller-manager-767865f676-nthm4\" (UID: \"fcf8691e-a944-41ca-8ee3-23280572780e\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-nthm4" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.992519 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l2wp\" (UniqueName: \"kubernetes.io/projected/592c8a29-a52a-4b53-86b0-318dbce9424d-kube-api-access-6l2wp\") pod \"nova-operator-controller-manager-5d488d59fb-z8nqm\" (UID: \"592c8a29-a52a-4b53-86b0-318dbce9424d\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" Mar 20 16:18:08 crc kubenswrapper[4708]: I0320 16:18:08.993361 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:08.998981 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.001395 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrgnk\" (UniqueName: \"kubernetes.io/projected/09c50326-bc3c-423d-a25a-39dc67a90efd-kube-api-access-mrgnk\") pod \"octavia-operator-controller-manager-5b9f45d989-55b97\" (UID: \"09c50326-bc3c-423d-a25a-39dc67a90efd\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.006294 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-52ngd" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.011085 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-dkfr4" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.013034 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.020043 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnmzr\" (UniqueName: \"kubernetes.io/projected/1f45c62f-cffa-416f-bead-eace9256fa45-kube-api-access-gnmzr\") pod \"mariadb-operator-controller-manager-67ccfc9778-5vc22\" (UID: \"1f45c62f-cffa-416f-bead-eace9256fa45\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.034459 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mhj7\" (UniqueName: \"kubernetes.io/projected/8589593b-d4f1-4492-8fdf-533db88caa80-kube-api-access-8mhj7\") pod \"ovn-operator-controller-manager-884679f54-hz97n\" (UID: 
\"8589593b-d4f1-4492-8fdf-533db88caa80\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.034502 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h62wb\" (UniqueName: \"kubernetes.io/projected/4623ed97-f89b-4be1-9d67-e1a5aecf325d-kube-api-access-h62wb\") pod \"placement-operator-controller-manager-5784578c99-pcdrn\" (UID: \"4623ed97-f89b-4be1-9d67-e1a5aecf325d\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.034535 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5dzcth\" (UID: \"26cc5de3-9cef-462a-9e28-ac485ed04178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.034554 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g876\" (UniqueName: \"kubernetes.io/projected/e1d5280f-8680-4272-bf5f-48c2239f731e-kube-api-access-5g876\") pod \"swift-operator-controller-manager-c674c5965-2lpl8\" (UID: \"e1d5280f-8680-4272-bf5f-48c2239f731e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.034614 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx4pf\" (UniqueName: \"kubernetes.io/projected/ac41d7c1-6dc3-49f7-8623-1294368145af-kube-api-access-bx4pf\") pod \"telemetry-operator-controller-manager-d6b694c5-4v48w\" (UID: \"ac41d7c1-6dc3-49f7-8623-1294368145af\") " 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.034643 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpxzp\" (UniqueName: \"kubernetes.io/projected/26cc5de3-9cef-462a-9e28-ac485ed04178-kube-api-access-cpxzp\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5dzcth\" (UID: \"26cc5de3-9cef-462a-9e28-ac485ed04178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.038433 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-hz97n"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.063216 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.070803 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.071790 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.081609 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-bkv7w" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.086392 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.100451 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-nthm4" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.105712 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.111838 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.132015 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.132810 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.137174 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mhj7\" (UniqueName: \"kubernetes.io/projected/8589593b-d4f1-4492-8fdf-533db88caa80-kube-api-access-8mhj7\") pod \"ovn-operator-controller-manager-884679f54-hz97n\" (UID: \"8589593b-d4f1-4492-8fdf-533db88caa80\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.137231 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h62wb\" (UniqueName: \"kubernetes.io/projected/4623ed97-f89b-4be1-9d67-e1a5aecf325d-kube-api-access-h62wb\") pod \"placement-operator-controller-manager-5784578c99-pcdrn\" (UID: \"4623ed97-f89b-4be1-9d67-e1a5aecf325d\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.137282 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5dzcth\" (UID: \"26cc5de3-9cef-462a-9e28-ac485ed04178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.137302 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g876\" (UniqueName: \"kubernetes.io/projected/e1d5280f-8680-4272-bf5f-48c2239f731e-kube-api-access-5g876\") pod \"swift-operator-controller-manager-c674c5965-2lpl8\" (UID: \"e1d5280f-8680-4272-bf5f-48c2239f731e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.137920 4708 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.137967 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert podName:26cc5de3-9cef-462a-9e28-ac485ed04178 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:09.637951062 +0000 UTC m=+1044.312287777 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5dzcth" (UID: "26cc5de3-9cef-462a-9e28-ac485ed04178") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.138276 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx4pf\" (UniqueName: \"kubernetes.io/projected/ac41d7c1-6dc3-49f7-8623-1294368145af-kube-api-access-bx4pf\") pod \"telemetry-operator-controller-manager-d6b694c5-4v48w\" (UID: \"ac41d7c1-6dc3-49f7-8623-1294368145af\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.138332 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpxzp\" (UniqueName: \"kubernetes.io/projected/26cc5de3-9cef-462a-9e28-ac485ed04178-kube-api-access-cpxzp\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5dzcth\" (UID: \"26cc5de3-9cef-462a-9e28-ac485ed04178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.138371 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzhsn\" (UniqueName: \"kubernetes.io/projected/9f93fb6f-7cd1-4022-a246-410d30457921-kube-api-access-jzhsn\") pod \"test-operator-controller-manager-5c5cb9c4d7-rpx5s\" (UID: \"9f93fb6f-7cd1-4022-a246-410d30457921\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.162958 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.164923 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mhj7\" (UniqueName: \"kubernetes.io/projected/8589593b-d4f1-4492-8fdf-533db88caa80-kube-api-access-8mhj7\") pod \"ovn-operator-controller-manager-884679f54-hz97n\" (UID: \"8589593b-d4f1-4492-8fdf-533db88caa80\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.167159 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpxzp\" (UniqueName: \"kubernetes.io/projected/26cc5de3-9cef-462a-9e28-ac485ed04178-kube-api-access-cpxzp\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5dzcth\" (UID: \"26cc5de3-9cef-462a-9e28-ac485ed04178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.167534 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g876\" (UniqueName: \"kubernetes.io/projected/e1d5280f-8680-4272-bf5f-48c2239f731e-kube-api-access-5g876\") pod \"swift-operator-controller-manager-c674c5965-2lpl8\" (UID: \"e1d5280f-8680-4272-bf5f-48c2239f731e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.168482 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.168612 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h62wb\" (UniqueName: \"kubernetes.io/projected/4623ed97-f89b-4be1-9d67-e1a5aecf325d-kube-api-access-h62wb\") pod \"placement-operator-controller-manager-5784578c99-pcdrn\" (UID: \"4623ed97-f89b-4be1-9d67-e1a5aecf325d\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" Mar 20 16:18:09 crc 
kubenswrapper[4708]: I0320 16:18:09.170425 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx4pf\" (UniqueName: \"kubernetes.io/projected/ac41d7c1-6dc3-49f7-8623-1294368145af-kube-api-access-bx4pf\") pod \"telemetry-operator-controller-manager-d6b694c5-4v48w\" (UID: \"ac41d7c1-6dc3-49f7-8623-1294368145af\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.175092 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.179663 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.181021 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.187354 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.198195 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.198967 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-hwjhw" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.218898 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.219441 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.220328 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.225835 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.225973 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hzkhj" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.226373 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.228512 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.239642 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.239723 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.239810 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxdqr\" (UniqueName: \"kubernetes.io/projected/a0b76f4f-b36a-4979-8110-db30012a6291-kube-api-access-nxdqr\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.239830 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhwx\" (UniqueName: \"kubernetes.io/projected/942f7804-64bd-4d04-badb-9cc592387608-kube-api-access-dlhwx\") pod \"watcher-operator-controller-manager-6c4d75f7f9-thrst\" (UID: \"942f7804-64bd-4d04-badb-9cc592387608\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.239860 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzhsn\" (UniqueName: \"kubernetes.io/projected/9f93fb6f-7cd1-4022-a246-410d30457921-kube-api-access-jzhsn\") pod \"test-operator-controller-manager-5c5cb9c4d7-rpx5s\" (UID: \"9f93fb6f-7cd1-4022-a246-410d30457921\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.242981 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-c5gz8" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.250367 4708 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.281512 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzhsn\" (UniqueName: \"kubernetes.io/projected/9f93fb6f-7cd1-4022-a246-410d30457921-kube-api-access-jzhsn\") pod \"test-operator-controller-manager-5c5cb9c4d7-rpx5s\" (UID: \"9f93fb6f-7cd1-4022-a246-410d30457921\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.332018 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.343165 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.343333 4708 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.343407 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs podName:a0b76f4f-b36a-4979-8110-db30012a6291 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:09.843385824 +0000 UTC m=+1044.517722539 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-x92n5" (UID: "a0b76f4f-b36a-4979-8110-db30012a6291") : secret "webhook-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.343338 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert\") pod \"infra-operator-controller-manager-8d4c8954d-tqkbf\" (UID: \"355388b4-930a-4dfa-bb0e-ba8b71e9e40d\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.343457 4708 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.343530 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert podName:355388b4-930a-4dfa-bb0e-ba8b71e9e40d nodeName:}" failed. No retries permitted until 2026-03-20 16:18:10.343507837 +0000 UTC m=+1045.017844592 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert") pod "infra-operator-controller-manager-8d4c8954d-tqkbf" (UID: "355388b4-930a-4dfa-bb0e-ba8b71e9e40d") : secret "infra-operator-webhook-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.343601 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxdqr\" (UniqueName: \"kubernetes.io/projected/a0b76f4f-b36a-4979-8110-db30012a6291-kube-api-access-nxdqr\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.343642 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhwx\" (UniqueName: \"kubernetes.io/projected/942f7804-64bd-4d04-badb-9cc592387608-kube-api-access-dlhwx\") pod \"watcher-operator-controller-manager-6c4d75f7f9-thrst\" (UID: \"942f7804-64bd-4d04-badb-9cc592387608\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.343832 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.344040 4708 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.344110 4708 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs podName:a0b76f4f-b36a-4979-8110-db30012a6291 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:09.844087632 +0000 UTC m=+1044.518424347 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs") pod "openstack-operator-controller-manager-5b5b55fc46-x92n5" (UID: "a0b76f4f-b36a-4979-8110-db30012a6291") : secret "metrics-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.368854 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxdqr\" (UniqueName: \"kubernetes.io/projected/a0b76f4f-b36a-4979-8110-db30012a6291-kube-api-access-nxdqr\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.376414 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhwx\" (UniqueName: \"kubernetes.io/projected/942f7804-64bd-4d04-badb-9cc592387608-kube-api-access-dlhwx\") pod \"watcher-operator-controller-manager-6c4d75f7f9-thrst\" (UID: \"942f7804-64bd-4d04-badb-9cc592387608\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.407279 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.428256 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.455747 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.493138 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.518975 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9"] Mar 20 16:18:09 crc kubenswrapper[4708]: W0320 16:18:09.549349 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19a022dc_93c1_4899_837f_4fd78793d13d.slice/crio-e150f3ce688da861d159634bc3af32833db365527c997b8ee16ee562596cd43a WatchSource:0}: Error finding container e150f3ce688da861d159634bc3af32833db365527c997b8ee16ee562596cd43a: Status 404 returned error can't find the container with id e150f3ce688da861d159634bc3af32833db365527c997b8ee16ee562596cd43a Mar 20 16:18:09 crc kubenswrapper[4708]: W0320 16:18:09.550507 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd19b5a0c_7421_4454_9c5a_d2bf4828901a.slice/crio-0397099738ec78ebf51c55ce26af3f3013ce6e8a8ebc3d44ebaa3ad8051f2aae WatchSource:0}: Error finding container 0397099738ec78ebf51c55ce26af3f3013ce6e8a8ebc3d44ebaa3ad8051f2aae: Status 404 returned error can't find the container with id 0397099738ec78ebf51c55ce26af3f3013ce6e8a8ebc3d44ebaa3ad8051f2aae Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.557020 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.660273 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5dzcth\" (UID: \"26cc5de3-9cef-462a-9e28-ac485ed04178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.660489 4708 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.660535 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert podName:26cc5de3-9cef-462a-9e28-ac485ed04178 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:10.660520516 +0000 UTC m=+1045.334857231 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5dzcth" (UID: "26cc5de3-9cef-462a-9e28-ac485ed04178") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.813520 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.825254 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h"] Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.863394 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:09 crc kubenswrapper[4708]: I0320 16:18:09.863626 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.863537 4708 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.863817 4708 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:18:09 crc 
kubenswrapper[4708]: E0320 16:18:09.863892 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs podName:a0b76f4f-b36a-4979-8110-db30012a6291 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:10.863874882 +0000 UTC m=+1045.538211587 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-x92n5" (UID: "a0b76f4f-b36a-4979-8110-db30012a6291") : secret "webhook-server-cert" not found Mar 20 16:18:09 crc kubenswrapper[4708]: E0320 16:18:09.863909 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs podName:a0b76f4f-b36a-4979-8110-db30012a6291 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:10.863902333 +0000 UTC m=+1045.538239048 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs") pod "openstack-operator-controller-manager-5b5b55fc46-x92n5" (UID: "a0b76f4f-b36a-4979-8110-db30012a6291") : secret "metrics-server-cert" not found Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.061824 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-52ngd"] Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.071034 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r"] Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.103924 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5"] Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.219881 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22"] Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.227868 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm"] Mar 20 16:18:10 crc kubenswrapper[4708]: W0320 16:18:10.237366 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod592c8a29_a52a_4b53_86b0_318dbce9424d.slice/crio-f821a02cdbb272fa8dd68799ca336dbe9d1295645c419c1752415f44c51f5fb5 WatchSource:0}: Error finding container f821a02cdbb272fa8dd68799ca336dbe9d1295645c419c1752415f44c51f5fb5: Status 404 returned error can't find the container with id f821a02cdbb272fa8dd68799ca336dbe9d1295645c419c1752415f44c51f5fb5 Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.243638 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-767865f676-nthm4"] Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.261231 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-52ngd" event={"ID":"efd0041f-ae6d-4c76-a7b5-092d353a029e","Type":"ContainerStarted","Data":"6e3d123f657be74038b393755b376e7794076c3213acdf32cf2f37b7230b9d93"} Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.263943 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" event={"ID":"592c8a29-a52a-4b53-86b0-318dbce9424d","Type":"ContainerStarted","Data":"f821a02cdbb272fa8dd68799ca336dbe9d1295645c419c1752415f44c51f5fb5"} Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.265184 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m" event={"ID":"04a6f0a4-ca27-4a23-986b-db16468db950","Type":"ContainerStarted","Data":"58389c5175539c87355d933788882cf361c980015432b2af4c3e981d7d9593d7"} Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.266132 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22" event={"ID":"1f45c62f-cffa-416f-bead-eace9256fa45","Type":"ContainerStarted","Data":"d1b3cbb0956672f535cac6553ba27a73f2e768ce031d38eac636e65bb38a8042"} Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.267407 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" event={"ID":"5b270fa1-96a9-4edd-b984-34900c29cc1b","Type":"ContainerStarted","Data":"b11fa28a0fd5191ac9bcbe28cb6de1a22e6c9a72d56a30b22ae7da0083a3396d"} Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.268433 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" event={"ID":"c39a1357-1bbd-4bad-b18d-db9e7eacad56","Type":"ContainerStarted","Data":"5ce995adf423eb80e56def6c4f1a5a3339557d401f051ae200fb220c6596952d"} Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.269433 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-nthm4" event={"ID":"fcf8691e-a944-41ca-8ee3-23280572780e","Type":"ContainerStarted","Data":"b4eb505cacb39ae58e8f8f24cd58823cfb521274cd9d60d85deb7dbb86698bd6"} Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.270546 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" event={"ID":"d19b5a0c-7421-4454-9c5a-d2bf4828901a","Type":"ContainerStarted","Data":"0397099738ec78ebf51c55ce26af3f3013ce6e8a8ebc3d44ebaa3ad8051f2aae"} Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.271495 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" event={"ID":"2247a6d3-610d-4ca0-bfe6-711bb14a22cf","Type":"ContainerStarted","Data":"464705804ba8692437b81ef5feb7eadb29641720df3ebc4acbbfc68dbd679ae8"} Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.272848 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" event={"ID":"19a022dc-93c1-4899-837f-4fd78793d13d","Type":"ContainerStarted","Data":"e150f3ce688da861d159634bc3af32833db365527c997b8ee16ee562596cd43a"} Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.371212 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert\") pod \"infra-operator-controller-manager-8d4c8954d-tqkbf\" (UID: \"355388b4-930a-4dfa-bb0e-ba8b71e9e40d\") " 
pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.371420 4708 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.371468 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert podName:355388b4-930a-4dfa-bb0e-ba8b71e9e40d nodeName:}" failed. No retries permitted until 2026-03-20 16:18:12.371451877 +0000 UTC m=+1047.045788592 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert") pod "infra-operator-controller-manager-8d4c8954d-tqkbf" (UID: "355388b4-930a-4dfa-bb0e-ba8b71e9e40d") : secret "infra-operator-webhook-server-cert" not found Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.377819 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s"] Mar 20 16:18:10 crc kubenswrapper[4708]: W0320 16:18:10.385121 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f93fb6f_7cd1_4022_a246_410d30457921.slice/crio-fec8ac35ea44f390c429b0edabf5df9f9c777778125ff094dfd9ecc2cd3e153d WatchSource:0}: Error finding container fec8ac35ea44f390c429b0edabf5df9f9c777778125ff094dfd9ecc2cd3e153d: Status 404 returned error can't find the container with id fec8ac35ea44f390c429b0edabf5df9f9c777778125ff094dfd9ecc2cd3e153d Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.385193 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn"] Mar 20 16:18:10 crc kubenswrapper[4708]: W0320 16:18:10.398270 4708 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4623ed97_f89b_4be1_9d67_e1a5aecf325d.slice/crio-66c2e89f93f98fd037567982e652783c8c39f5cdbc383b7bb7e711bf4071d824 WatchSource:0}: Error finding container 66c2e89f93f98fd037567982e652783c8c39f5cdbc383b7bb7e711bf4071d824: Status 404 returned error can't find the container with id 66c2e89f93f98fd037567982e652783c8c39f5cdbc383b7bb7e711bf4071d824 Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.574207 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-hz97n"] Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.586293 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97"] Mar 20 16:18:10 crc kubenswrapper[4708]: W0320 16:18:10.604443 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09c50326_bc3c_423d_a25a_39dc67a90efd.slice/crio-1e56aff092ce163390a90ac8289f0793e1f330757c52fedbc24023bdac09c478 WatchSource:0}: Error finding container 1e56aff092ce163390a90ac8289f0793e1f330757c52fedbc24023bdac09c478: Status 404 returned error can't find the container with id 1e56aff092ce163390a90ac8289f0793e1f330757c52fedbc24023bdac09c478 Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.631959 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-72qtd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-nlvdk_openstack-operators(0a721b9a-b944-45de-8eed-3f181e09f6cf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.633065 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" podUID="0a721b9a-b944-45de-8eed-3f181e09f6cf" Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.634220 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst"] Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.642971 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6"] Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.645178 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8mhj7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-hz97n_openstack-operators(8589593b-d4f1-4492-8fdf-533db88caa80): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.646367 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" podUID="8589593b-d4f1-4492-8fdf-533db88caa80" Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.655464 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk"] Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.663888 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w"] Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.669207 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bx4pf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-4v48w_openstack-operators(ac41d7c1-6dc3-49f7-8623-1294368145af): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.669952 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8"] Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.670330 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" podUID="ac41d7c1-6dc3-49f7-8623-1294368145af" Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.671703 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5g876,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-2lpl8_openstack-operators(e1d5280f-8680-4272-bf5f-48c2239f731e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.674327 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" podUID="e1d5280f-8680-4272-bf5f-48c2239f731e" Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.676096 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5dzcth\" (UID: \"26cc5de3-9cef-462a-9e28-ac485ed04178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.676318 4708 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found 
Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.676394 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert podName:26cc5de3-9cef-462a-9e28-ac485ed04178 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:12.676374913 +0000 UTC m=+1047.350711688 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5dzcth" (UID: "26cc5de3-9cef-462a-9e28-ac485ed04178") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.879766 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:10 crc kubenswrapper[4708]: I0320 16:18:10.879931 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.880070 4708 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.880120 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs podName:a0b76f4f-b36a-4979-8110-db30012a6291 
nodeName:}" failed. No retries permitted until 2026-03-20 16:18:12.880105589 +0000 UTC m=+1047.554442304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs") pod "openstack-operator-controller-manager-5b5b55fc46-x92n5" (UID: "a0b76f4f-b36a-4979-8110-db30012a6291") : secret "metrics-server-cert" not found Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.880161 4708 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:18:10 crc kubenswrapper[4708]: E0320 16:18:10.880180 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs podName:a0b76f4f-b36a-4979-8110-db30012a6291 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:12.880174201 +0000 UTC m=+1047.554510916 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-x92n5" (UID: "a0b76f4f-b36a-4979-8110-db30012a6291") : secret "webhook-server-cert" not found Mar 20 16:18:11 crc kubenswrapper[4708]: I0320 16:18:11.288414 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" event={"ID":"0a721b9a-b944-45de-8eed-3f181e09f6cf","Type":"ContainerStarted","Data":"6912fe35a8b5e2248d830614cec272f4a2f67c2f6fe7f5c43f3dc598e8ae0b78"} Mar 20 16:18:11 crc kubenswrapper[4708]: I0320 16:18:11.293118 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" event={"ID":"8589593b-d4f1-4492-8fdf-533db88caa80","Type":"ContainerStarted","Data":"90b929146c96902a3af803124c3fdc5c9c2d42a6bcc5f39fc3e98b32db113301"} Mar 20 16:18:11 crc 
kubenswrapper[4708]: E0320 16:18:11.294209 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" podUID="0a721b9a-b944-45de-8eed-3f181e09f6cf" Mar 20 16:18:11 crc kubenswrapper[4708]: E0320 16:18:11.296604 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" podUID="8589593b-d4f1-4492-8fdf-533db88caa80" Mar 20 16:18:11 crc kubenswrapper[4708]: I0320 16:18:11.297819 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst" event={"ID":"942f7804-64bd-4d04-badb-9cc592387608","Type":"ContainerStarted","Data":"fe4923bd431fd225ec756ae94c4e6f31f5fb24b60737e11bbb1fc38b6a7dbeaf"} Mar 20 16:18:11 crc kubenswrapper[4708]: I0320 16:18:11.301030 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" event={"ID":"ac41d7c1-6dc3-49f7-8623-1294368145af","Type":"ContainerStarted","Data":"74f9457a9eb84a468608efdcb6ddb19972b1237d7b7651a09e08818997f071de"} Mar 20 16:18:11 crc kubenswrapper[4708]: I0320 16:18:11.304984 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s" event={"ID":"9f93fb6f-7cd1-4022-a246-410d30457921","Type":"ContainerStarted","Data":"fec8ac35ea44f390c429b0edabf5df9f9c777778125ff094dfd9ecc2cd3e153d"} Mar 20 16:18:11 crc 
kubenswrapper[4708]: I0320 16:18:11.308433 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" event={"ID":"4623ed97-f89b-4be1-9d67-e1a5aecf325d","Type":"ContainerStarted","Data":"66c2e89f93f98fd037567982e652783c8c39f5cdbc383b7bb7e711bf4071d824"} Mar 20 16:18:11 crc kubenswrapper[4708]: I0320 16:18:11.310423 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6" event={"ID":"01ac874a-addd-4513-94c5-cd4b71fa6eb5","Type":"ContainerStarted","Data":"be14eee667c4affee98ef57043046543312ef0dbb4599bb185575003e39f2ef3"} Mar 20 16:18:11 crc kubenswrapper[4708]: I0320 16:18:11.317218 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97" event={"ID":"09c50326-bc3c-423d-a25a-39dc67a90efd","Type":"ContainerStarted","Data":"1e56aff092ce163390a90ac8289f0793e1f330757c52fedbc24023bdac09c478"} Mar 20 16:18:11 crc kubenswrapper[4708]: E0320 16:18:11.319619 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" podUID="ac41d7c1-6dc3-49f7-8623-1294368145af" Mar 20 16:18:11 crc kubenswrapper[4708]: I0320 16:18:11.319878 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" event={"ID":"e1d5280f-8680-4272-bf5f-48c2239f731e","Type":"ContainerStarted","Data":"87b05bbc13b5b26664b9566cb00449755b1ef232ae44b5239de6625e685effa7"} Mar 20 16:18:11 crc kubenswrapper[4708]: E0320 16:18:11.326938 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" podUID="e1d5280f-8680-4272-bf5f-48c2239f731e" Mar 20 16:18:11 crc kubenswrapper[4708]: I0320 16:18:11.370111 4708 scope.go:117] "RemoveContainer" containerID="2dd799a0f98e8e8abcf4f9b4ce9724929354020078d9d4633d99a9d33d752054" Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.343284 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" podUID="8589593b-d4f1-4492-8fdf-533db88caa80" Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.346393 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" podUID="e1d5280f-8680-4272-bf5f-48c2239f731e" Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.346495 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" podUID="ac41d7c1-6dc3-49f7-8623-1294368145af" Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.350453 4708 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" podUID="0a721b9a-b944-45de-8eed-3f181e09f6cf" Mar 20 16:18:12 crc kubenswrapper[4708]: I0320 16:18:12.413393 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert\") pod \"infra-operator-controller-manager-8d4c8954d-tqkbf\" (UID: \"355388b4-930a-4dfa-bb0e-ba8b71e9e40d\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.413570 4708 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.413660 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert podName:355388b4-930a-4dfa-bb0e-ba8b71e9e40d nodeName:}" failed. No retries permitted until 2026-03-20 16:18:16.41364092 +0000 UTC m=+1051.087977635 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert") pod "infra-operator-controller-manager-8d4c8954d-tqkbf" (UID: "355388b4-930a-4dfa-bb0e-ba8b71e9e40d") : secret "infra-operator-webhook-server-cert" not found Mar 20 16:18:12 crc kubenswrapper[4708]: I0320 16:18:12.724453 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5dzcth\" (UID: \"26cc5de3-9cef-462a-9e28-ac485ed04178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.724787 4708 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.724940 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert podName:26cc5de3-9cef-462a-9e28-ac485ed04178 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:16.724907747 +0000 UTC m=+1051.399244622 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5dzcth" (UID: "26cc5de3-9cef-462a-9e28-ac485ed04178") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:18:12 crc kubenswrapper[4708]: I0320 16:18:12.927267 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:12 crc kubenswrapper[4708]: I0320 16:18:12.927342 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.927509 4708 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.927556 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs podName:a0b76f4f-b36a-4979-8110-db30012a6291 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:16.927541024 +0000 UTC m=+1051.601877739 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-x92n5" (UID: "a0b76f4f-b36a-4979-8110-db30012a6291") : secret "webhook-server-cert" not found Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.927883 4708 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 16:18:12 crc kubenswrapper[4708]: E0320 16:18:12.927978 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs podName:a0b76f4f-b36a-4979-8110-db30012a6291 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:16.927954604 +0000 UTC m=+1051.602291389 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs") pod "openstack-operator-controller-manager-5b5b55fc46-x92n5" (UID: "a0b76f4f-b36a-4979-8110-db30012a6291") : secret "metrics-server-cert" not found Mar 20 16:18:16 crc kubenswrapper[4708]: I0320 16:18:16.490313 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert\") pod \"infra-operator-controller-manager-8d4c8954d-tqkbf\" (UID: \"355388b4-930a-4dfa-bb0e-ba8b71e9e40d\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:16 crc kubenswrapper[4708]: E0320 16:18:16.490535 4708 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 16:18:16 crc kubenswrapper[4708]: E0320 16:18:16.490792 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert 
podName:355388b4-930a-4dfa-bb0e-ba8b71e9e40d nodeName:}" failed. No retries permitted until 2026-03-20 16:18:24.490767543 +0000 UTC m=+1059.165104258 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert") pod "infra-operator-controller-manager-8d4c8954d-tqkbf" (UID: "355388b4-930a-4dfa-bb0e-ba8b71e9e40d") : secret "infra-operator-webhook-server-cert" not found Mar 20 16:18:16 crc kubenswrapper[4708]: I0320 16:18:16.795480 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5dzcth\" (UID: \"26cc5de3-9cef-462a-9e28-ac485ed04178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:16 crc kubenswrapper[4708]: E0320 16:18:16.795723 4708 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:18:16 crc kubenswrapper[4708]: E0320 16:18:16.795821 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert podName:26cc5de3-9cef-462a-9e28-ac485ed04178 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:24.795797062 +0000 UTC m=+1059.470133847 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5dzcth" (UID: "26cc5de3-9cef-462a-9e28-ac485ed04178") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 16:18:16 crc kubenswrapper[4708]: I0320 16:18:16.998389 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:16 crc kubenswrapper[4708]: I0320 16:18:16.998478 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:16 crc kubenswrapper[4708]: E0320 16:18:16.998745 4708 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:18:16 crc kubenswrapper[4708]: E0320 16:18:16.998811 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs podName:a0b76f4f-b36a-4979-8110-db30012a6291 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:24.998790119 +0000 UTC m=+1059.673126834 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-x92n5" (UID: "a0b76f4f-b36a-4979-8110-db30012a6291") : secret "webhook-server-cert" not found Mar 20 16:18:16 crc kubenswrapper[4708]: E0320 16:18:16.999198 4708 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 16:18:16 crc kubenswrapper[4708]: E0320 16:18:16.999235 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs podName:a0b76f4f-b36a-4979-8110-db30012a6291 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:24.999225011 +0000 UTC m=+1059.673561736 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs") pod "openstack-operator-controller-manager-5b5b55fc46-x92n5" (UID: "a0b76f4f-b36a-4979-8110-db30012a6291") : secret "metrics-server-cert" not found Mar 20 16:18:22 crc kubenswrapper[4708]: E0320 16:18:22.928491 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d" Mar 20 16:18:22 crc kubenswrapper[4708]: E0320 16:18:22.929537 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2bxrz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-59bc569d95-xr46h_openstack-operators(c39a1357-1bbd-4bad-b18d-db9e7eacad56): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:18:22 crc kubenswrapper[4708]: E0320 16:18:22.930736 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" podUID="c39a1357-1bbd-4bad-b18d-db9e7eacad56" Mar 20 16:18:23 crc kubenswrapper[4708]: E0320 16:18:23.464322 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" podUID="c39a1357-1bbd-4bad-b18d-db9e7eacad56" Mar 20 16:18:23 crc kubenswrapper[4708]: E0320 16:18:23.497053 4708 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777" Mar 20 16:18:23 crc kubenswrapper[4708]: E0320 16:18:23.497271 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5xt28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d58dc466-sl77r_openstack-operators(d19b5a0c-7421-4454-9c5a-d2bf4828901a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:18:23 crc kubenswrapper[4708]: E0320 16:18:23.498536 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" podUID="d19b5a0c-7421-4454-9c5a-d2bf4828901a" Mar 20 16:18:24 crc kubenswrapper[4708]: E0320 16:18:24.096613 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622" Mar 20 16:18:24 crc kubenswrapper[4708]: E0320 16:18:24.097167 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h62wb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-pcdrn_openstack-operators(4623ed97-f89b-4be1-9d67-e1a5aecf325d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:18:24 crc kubenswrapper[4708]: E0320 16:18:24.099603 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" podUID="4623ed97-f89b-4be1-9d67-e1a5aecf325d" Mar 20 16:18:24 crc kubenswrapper[4708]: E0320 16:18:24.477264 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" podUID="4623ed97-f89b-4be1-9d67-e1a5aecf325d" Mar 20 16:18:24 crc kubenswrapper[4708]: E0320 16:18:24.477330 4708 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" podUID="d19b5a0c-7421-4454-9c5a-d2bf4828901a" Mar 20 16:18:24 crc kubenswrapper[4708]: I0320 16:18:24.528149 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert\") pod \"infra-operator-controller-manager-8d4c8954d-tqkbf\" (UID: \"355388b4-930a-4dfa-bb0e-ba8b71e9e40d\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:24 crc kubenswrapper[4708]: I0320 16:18:24.533542 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/355388b4-930a-4dfa-bb0e-ba8b71e9e40d-cert\") pod \"infra-operator-controller-manager-8d4c8954d-tqkbf\" (UID: \"355388b4-930a-4dfa-bb0e-ba8b71e9e40d\") " pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:24 crc kubenswrapper[4708]: I0320 16:18:24.784876 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:24 crc kubenswrapper[4708]: E0320 16:18:24.807066 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad" Mar 20 16:18:24 crc kubenswrapper[4708]: E0320 16:18:24.807272 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jdxgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-588d4d986b-fznb9_openstack-operators(19a022dc-93c1-4899-837f-4fd78793d13d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:18:24 crc kubenswrapper[4708]: E0320 16:18:24.808623 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" podUID="19a022dc-93c1-4899-837f-4fd78793d13d" Mar 20 16:18:24 crc kubenswrapper[4708]: I0320 16:18:24.831984 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5dzcth\" (UID: \"26cc5de3-9cef-462a-9e28-ac485ed04178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:24 crc kubenswrapper[4708]: I0320 16:18:24.838255 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/26cc5de3-9cef-462a-9e28-ac485ed04178-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5dzcth\" (UID: \"26cc5de3-9cef-462a-9e28-ac485ed04178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:24 crc kubenswrapper[4708]: I0320 16:18:24.868302 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:25 crc kubenswrapper[4708]: I0320 16:18:25.034402 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:25 crc kubenswrapper[4708]: I0320 16:18:25.034519 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:25 crc kubenswrapper[4708]: E0320 16:18:25.034754 4708 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 16:18:25 crc kubenswrapper[4708]: E0320 16:18:25.034831 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs podName:a0b76f4f-b36a-4979-8110-db30012a6291 nodeName:}" failed. No retries permitted until 2026-03-20 16:18:41.034813558 +0000 UTC m=+1075.709150273 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs") pod "openstack-operator-controller-manager-5b5b55fc46-x92n5" (UID: "a0b76f4f-b36a-4979-8110-db30012a6291") : secret "webhook-server-cert" not found Mar 20 16:18:25 crc kubenswrapper[4708]: I0320 16:18:25.049635 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-metrics-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:25 crc kubenswrapper[4708]: E0320 16:18:25.483505 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" podUID="19a022dc-93c1-4899-837f-4fd78793d13d" Mar 20 16:18:26 crc kubenswrapper[4708]: E0320 16:18:26.848260 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 16:18:26 crc kubenswrapper[4708]: E0320 16:18:26.848427 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6l2wp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-z8nqm_openstack-operators(592c8a29-a52a-4b53-86b0-318dbce9424d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:18:26 crc kubenswrapper[4708]: E0320 16:18:26.849560 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" podUID="592c8a29-a52a-4b53-86b0-318dbce9424d" Mar 20 16:18:27 crc kubenswrapper[4708]: E0320 16:18:27.502154 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" podUID="592c8a29-a52a-4b53-86b0-318dbce9424d" Mar 20 16:18:27 crc kubenswrapper[4708]: E0320 16:18:27.736433 4708 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113" Mar 20 16:18:27 crc kubenswrapper[4708]: E0320 16:18:27.736756 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6m2wf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-zxbx5_openstack-operators(2247a6d3-610d-4ca0-bfe6-711bb14a22cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:18:27 crc kubenswrapper[4708]: E0320 16:18:27.737932 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" podUID="2247a6d3-610d-4ca0-bfe6-711bb14a22cf" Mar 20 16:18:28 crc kubenswrapper[4708]: E0320 16:18:28.300460 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 16:18:28 crc kubenswrapper[4708]: E0320 16:18:28.300647 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tmbz8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-rxb8r_openstack-operators(5b270fa1-96a9-4edd-b984-34900c29cc1b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:18:28 crc kubenswrapper[4708]: E0320 16:18:28.301846 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" podUID="5b270fa1-96a9-4edd-b984-34900c29cc1b" Mar 20 16:18:28 crc kubenswrapper[4708]: E0320 16:18:28.505912 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" podUID="2247a6d3-610d-4ca0-bfe6-711bb14a22cf" Mar 20 16:18:28 crc kubenswrapper[4708]: E0320 16:18:28.505932 4708 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" podUID="5b270fa1-96a9-4edd-b984-34900c29cc1b" Mar 20 16:18:31 crc kubenswrapper[4708]: I0320 16:18:31.563085 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst" event={"ID":"942f7804-64bd-4d04-badb-9cc592387608","Type":"ContainerStarted","Data":"f24b5f5991911a422298cf80dd09bf8854f427e68ffaebc0d574c56761f875eb"} Mar 20 16:18:31 crc kubenswrapper[4708]: I0320 16:18:31.564795 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst" Mar 20 16:18:31 crc kubenswrapper[4708]: I0320 16:18:31.568360 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97" event={"ID":"09c50326-bc3c-423d-a25a-39dc67a90efd","Type":"ContainerStarted","Data":"f8805e9528487ff194b1817088e34eee446919f846dfeecb187f71e99cb90f29"} Mar 20 16:18:31 crc kubenswrapper[4708]: I0320 16:18:31.569057 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97" Mar 20 16:18:31 crc kubenswrapper[4708]: I0320 16:18:31.573030 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s" event={"ID":"9f93fb6f-7cd1-4022-a246-410d30457921","Type":"ContainerStarted","Data":"ad08f03da6c98d07fccd49fac705004bce9a003c267d578974da11b062e64602"} Mar 20 16:18:31 crc kubenswrapper[4708]: I0320 16:18:31.573248 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s" Mar 20 16:18:31 crc kubenswrapper[4708]: I0320 16:18:31.581090 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst" podStartSLOduration=5.308077386 podStartE2EDuration="23.581054128s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.606339948 +0000 UTC m=+1045.280676663" lastFinishedPulling="2026-03-20 16:18:28.87931669 +0000 UTC m=+1063.553653405" observedRunningTime="2026-03-20 16:18:31.581009777 +0000 UTC m=+1066.255346492" watchObservedRunningTime="2026-03-20 16:18:31.581054128 +0000 UTC m=+1066.255390863" Mar 20 16:18:31 crc kubenswrapper[4708]: I0320 16:18:31.602737 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf"] Mar 20 16:18:31 crc kubenswrapper[4708]: I0320 16:18:31.606645 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s" podStartSLOduration=5.114727386 podStartE2EDuration="23.606625223s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.387981859 +0000 UTC m=+1045.062318574" lastFinishedPulling="2026-03-20 16:18:28.879879696 +0000 UTC m=+1063.554216411" observedRunningTime="2026-03-20 16:18:31.595918907 +0000 UTC m=+1066.270255622" watchObservedRunningTime="2026-03-20 16:18:31.606625223 +0000 UTC m=+1066.280961948" Mar 20 16:18:31 crc kubenswrapper[4708]: I0320 16:18:31.617296 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth"] Mar 20 16:18:32 crc kubenswrapper[4708]: W0320 16:18:32.230486 4708 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26cc5de3_9cef_462a_9e28_ac485ed04178.slice/crio-1abe0738c75004ac21792876c14fda5c00ac694680e8166e2663881081645773 WatchSource:0}: Error finding container 1abe0738c75004ac21792876c14fda5c00ac694680e8166e2663881081645773: Status 404 returned error can't find the container with id 1abe0738c75004ac21792876c14fda5c00ac694680e8166e2663881081645773 Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.581619 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-nthm4" event={"ID":"fcf8691e-a944-41ca-8ee3-23280572780e","Type":"ContainerStarted","Data":"a118dc0cf202eb0c9c295edf51106fc5384110c1d37ff59c0f5c2d42c9ad23ad"} Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.582806 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-nthm4" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.584640 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" event={"ID":"8589593b-d4f1-4492-8fdf-533db88caa80","Type":"ContainerStarted","Data":"a96597292a843449460707165f6bf6f11efc2b36150ccb5253dc804d20cd0206"} Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.585249 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.589150 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-52ngd" event={"ID":"efd0041f-ae6d-4c76-a7b5-092d353a029e","Type":"ContainerStarted","Data":"703d508705ef81a652a81c77459d090c8f58192c57a2d3d5f3859e587203eb7c"} Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.589299 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-52ngd" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.593902 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" event={"ID":"355388b4-930a-4dfa-bb0e-ba8b71e9e40d","Type":"ContainerStarted","Data":"1883b0c6de1c91298795009d9ce3e81f2939820bc2e05e155402ebec4f7dc30a"} Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.595441 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" event={"ID":"26cc5de3-9cef-462a-9e28-ac485ed04178","Type":"ContainerStarted","Data":"1abe0738c75004ac21792876c14fda5c00ac694680e8166e2663881081645773"} Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.600226 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-nthm4" podStartSLOduration=5.970944928 podStartE2EDuration="24.600203733s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.251109254 +0000 UTC m=+1044.925445969" lastFinishedPulling="2026-03-20 16:18:28.880368059 +0000 UTC m=+1063.554704774" observedRunningTime="2026-03-20 16:18:32.597191052 +0000 UTC m=+1067.271527767" watchObservedRunningTime="2026-03-20 16:18:32.600203733 +0000 UTC m=+1067.274540448" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.605134 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" event={"ID":"e1d5280f-8680-4272-bf5f-48c2239f731e","Type":"ContainerStarted","Data":"da29a4cfbdd4fd27ba0855c03c28476dd91daf1abce8b114079cc28e29ed65cd"} Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.605400 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.607624 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m" event={"ID":"04a6f0a4-ca27-4a23-986b-db16468db950","Type":"ContainerStarted","Data":"9dc76af869ec44c269b12881e13969f00fe380eeb9607255bfb7e19bbb52ff9e"} Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.607731 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.609224 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6" event={"ID":"01ac874a-addd-4513-94c5-cd4b71fa6eb5","Type":"ContainerStarted","Data":"48dc95b019b47c96772e22bc8712e19c4ad8e41eef0498c184d4ef64d9d00562"} Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.609429 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.604410 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97" podStartSLOduration=6.333072505 podStartE2EDuration="24.604392685s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.608857564 +0000 UTC m=+1045.283194279" lastFinishedPulling="2026-03-20 16:18:28.880177744 +0000 UTC m=+1063.554514459" observedRunningTime="2026-03-20 16:18:31.628002036 +0000 UTC m=+1066.302338761" watchObservedRunningTime="2026-03-20 16:18:32.604392685 +0000 UTC m=+1067.278729470" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.615087 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22" event={"ID":"1f45c62f-cffa-416f-bead-eace9256fa45","Type":"ContainerStarted","Data":"a4f67bf33dba0ac1c4aa3b195fb551803c9bbb5d06072b3d18a93f3b47759a7b"} Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.615932 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.618450 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" event={"ID":"ac41d7c1-6dc3-49f7-8623-1294368145af","Type":"ContainerStarted","Data":"541c21f6215dbc7d6f3a55446d49f3d8e29f36d663ef12380240f76cbe52e099"} Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.619076 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.620823 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" event={"ID":"0a721b9a-b944-45de-8eed-3f181e09f6cf","Type":"ContainerStarted","Data":"34d45527f656bcc8bd6998df8650ca3139ec145dbe7cbfa34efa39a35ed66bd2"} Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.621225 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.626169 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" podStartSLOduration=4.190946786 podStartE2EDuration="24.626155298s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.644376886 +0000 UTC m=+1045.318713601" 
lastFinishedPulling="2026-03-20 16:18:31.079585398 +0000 UTC m=+1065.753922113" observedRunningTime="2026-03-20 16:18:32.62359445 +0000 UTC m=+1067.297931165" watchObservedRunningTime="2026-03-20 16:18:32.626155298 +0000 UTC m=+1067.300492013" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.642127 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-52ngd" podStartSLOduration=5.851417166 podStartE2EDuration="24.642113005s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.088896539 +0000 UTC m=+1044.763233254" lastFinishedPulling="2026-03-20 16:18:28.879592378 +0000 UTC m=+1063.553929093" observedRunningTime="2026-03-20 16:18:32.641565561 +0000 UTC m=+1067.315902286" watchObservedRunningTime="2026-03-20 16:18:32.642113005 +0000 UTC m=+1067.316449720" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.704088 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" podStartSLOduration=4.290850232 podStartE2EDuration="24.704070924s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.667717282 +0000 UTC m=+1045.342053997" lastFinishedPulling="2026-03-20 16:18:31.080937974 +0000 UTC m=+1065.755274689" observedRunningTime="2026-03-20 16:18:32.701140406 +0000 UTC m=+1067.375477121" watchObservedRunningTime="2026-03-20 16:18:32.704070924 +0000 UTC m=+1067.378407639" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.707931 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" podStartSLOduration=4.302243107 podStartE2EDuration="24.707924848s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.671192054 +0000 UTC m=+1045.345528769" 
lastFinishedPulling="2026-03-20 16:18:31.076873795 +0000 UTC m=+1065.751210510" observedRunningTime="2026-03-20 16:18:32.665499712 +0000 UTC m=+1067.339836427" watchObservedRunningTime="2026-03-20 16:18:32.707924848 +0000 UTC m=+1067.382261563" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.731537 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6" podStartSLOduration=6.464338602 podStartE2EDuration="24.73151109s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.611633949 +0000 UTC m=+1045.285970664" lastFinishedPulling="2026-03-20 16:18:28.878806437 +0000 UTC m=+1063.553143152" observedRunningTime="2026-03-20 16:18:32.730934554 +0000 UTC m=+1067.405271269" watchObservedRunningTime="2026-03-20 16:18:32.73151109 +0000 UTC m=+1067.405847805" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.757073 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22" podStartSLOduration=6.135954507 podStartE2EDuration="24.757052104s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.258481171 +0000 UTC m=+1044.932817886" lastFinishedPulling="2026-03-20 16:18:28.879578768 +0000 UTC m=+1063.553915483" observedRunningTime="2026-03-20 16:18:32.750175799 +0000 UTC m=+1067.424512514" watchObservedRunningTime="2026-03-20 16:18:32.757052104 +0000 UTC m=+1067.431388819" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.787859 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" podStartSLOduration=3.110357996 podStartE2EDuration="24.787843568s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.631803619 +0000 UTC m=+1045.306140334" 
lastFinishedPulling="2026-03-20 16:18:32.309289191 +0000 UTC m=+1066.983625906" observedRunningTime="2026-03-20 16:18:32.786343428 +0000 UTC m=+1067.460680143" watchObservedRunningTime="2026-03-20 16:18:32.787843568 +0000 UTC m=+1067.462180273" Mar 20 16:18:32 crc kubenswrapper[4708]: I0320 16:18:32.807943 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m" podStartSLOduration=5.809919254 podStartE2EDuration="24.807920145s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:09.881568357 +0000 UTC m=+1044.555905072" lastFinishedPulling="2026-03-20 16:18:28.879569248 +0000 UTC m=+1063.553905963" observedRunningTime="2026-03-20 16:18:32.799968953 +0000 UTC m=+1067.474305668" watchObservedRunningTime="2026-03-20 16:18:32.807920145 +0000 UTC m=+1067.482256860" Mar 20 16:18:38 crc kubenswrapper[4708]: I0320 16:18:38.732542 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ttl8m" Mar 20 16:18:39 crc kubenswrapper[4708]: I0320 16:18:39.016580 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-52ngd" Mar 20 16:18:39 crc kubenswrapper[4708]: I0320 16:18:39.065547 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5vc22" Mar 20 16:18:39 crc kubenswrapper[4708]: I0320 16:18:39.103984 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-nthm4" Mar 20 16:18:39 crc kubenswrapper[4708]: I0320 16:18:39.134945 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-kckv6" Mar 20 16:18:39 crc 
kubenswrapper[4708]: I0320 16:18:39.190981 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-55b97" Mar 20 16:18:39 crc kubenswrapper[4708]: I0320 16:18:39.223801 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-hz97n" Mar 20 16:18:39 crc kubenswrapper[4708]: I0320 16:18:39.254705 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nlvdk" Mar 20 16:18:39 crc kubenswrapper[4708]: I0320 16:18:39.432079 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lpl8" Mar 20 16:18:39 crc kubenswrapper[4708]: I0320 16:18:39.458788 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-4v48w" Mar 20 16:18:39 crc kubenswrapper[4708]: I0320 16:18:39.497170 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rpx5s" Mar 20 16:18:39 crc kubenswrapper[4708]: I0320 16:18:39.563695 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-thrst" Mar 20 16:18:41 crc kubenswrapper[4708]: I0320 16:18:41.112930 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:41 crc kubenswrapper[4708]: I0320 16:18:41.119358 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a0b76f4f-b36a-4979-8110-db30012a6291-webhook-certs\") pod \"openstack-operator-controller-manager-5b5b55fc46-x92n5\" (UID: \"a0b76f4f-b36a-4979-8110-db30012a6291\") " pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:41 crc kubenswrapper[4708]: I0320 16:18:41.128248 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:42 crc kubenswrapper[4708]: I0320 16:18:42.074043 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5"] Mar 20 16:18:42 crc kubenswrapper[4708]: W0320 16:18:42.075468 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0b76f4f_b36a_4979_8110_db30012a6291.slice/crio-fd1439dd0f85dd62d38138284f64838b0522024e351aeadbf2053e62eaa6c27d WatchSource:0}: Error finding container fd1439dd0f85dd62d38138284f64838b0522024e351aeadbf2053e62eaa6c27d: Status 404 returned error can't find the container with id fd1439dd0f85dd62d38138284f64838b0522024e351aeadbf2053e62eaa6c27d Mar 20 16:18:42 crc kubenswrapper[4708]: I0320 16:18:42.698911 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" event={"ID":"a0b76f4f-b36a-4979-8110-db30012a6291","Type":"ContainerStarted","Data":"fd1439dd0f85dd62d38138284f64838b0522024e351aeadbf2053e62eaa6c27d"} Mar 20 16:18:43 crc kubenswrapper[4708]: I0320 16:18:43.708000 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" 
event={"ID":"a0b76f4f-b36a-4979-8110-db30012a6291","Type":"ContainerStarted","Data":"42b2650d7522540d970841744496c8b68b1f1f9fd01395a5088003dff397cb09"} Mar 20 16:18:43 crc kubenswrapper[4708]: I0320 16:18:43.709166 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:43 crc kubenswrapper[4708]: I0320 16:18:43.734850 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" podStartSLOduration=35.734826479 podStartE2EDuration="35.734826479s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:18:43.730587585 +0000 UTC m=+1078.404924320" watchObservedRunningTime="2026-03-20 16:18:43.734826479 +0000 UTC m=+1078.409163194" Mar 20 16:18:45 crc kubenswrapper[4708]: E0320 16:18:45.217626 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bf7cdbfb125c4327b35870f8640cbed9ddc32d6f07fedd117c6fd59f16463329" Mar 20 16:18:45 crc kubenswrapper[4708]: E0320 16:18:45.218285 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bf7cdbfb125c4327b35870f8640cbed9ddc32d6f07fedd117c6fd59f16463329,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOME
TER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom
:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack
-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVa
r{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUT
E_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/lmiccini/openstack-rabbitmq:r42p,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED
_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cpxzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-86657c54f5dzcth_openstack-operators(26cc5de3-9cef-462a-9e28-ac485ed04178): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:18:45 crc kubenswrapper[4708]: E0320 16:18:45.219602 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" podUID="26cc5de3-9cef-462a-9e28-ac485ed04178" Mar 20 16:18:45 crc kubenswrapper[4708]: E0320 16:18:45.731171 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bf7cdbfb125c4327b35870f8640cbed9ddc32d6f07fedd117c6fd59f16463329\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" podUID="26cc5de3-9cef-462a-9e28-ac485ed04178" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.734550 
4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" event={"ID":"355388b4-930a-4dfa-bb0e-ba8b71e9e40d","Type":"ContainerStarted","Data":"b3407b1f07d13bef295c40749f0e1f986eb70c57e88091394973a33500424a93"} Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.734979 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.737022 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" event={"ID":"c39a1357-1bbd-4bad-b18d-db9e7eacad56","Type":"ContainerStarted","Data":"262330b09f03ff4b5869ad508b0e9726dfc87d563e2406398c65e88e55ff1150"} Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.737240 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.738524 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" event={"ID":"592c8a29-a52a-4b53-86b0-318dbce9424d","Type":"ContainerStarted","Data":"9d47babd3617bf015d1fb0bae49807294d756fdf73085b0394024b7a15c50b38"} Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.738726 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.741208 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" event={"ID":"4623ed97-f89b-4be1-9d67-e1a5aecf325d","Type":"ContainerStarted","Data":"b967666cce1677dc6a215aac9c19e90b1e093cf66c1b734b46b47ab327bd82a1"} Mar 20 16:18:46 crc kubenswrapper[4708]: 
I0320 16:18:46.741746 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.746706 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" event={"ID":"d19b5a0c-7421-4454-9c5a-d2bf4828901a","Type":"ContainerStarted","Data":"94220d0cb17c4d7b8a649fded2cec9b2a9d84e84f08a7ffe1f132dcc7780162c"} Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.746947 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.748374 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" event={"ID":"2247a6d3-610d-4ca0-bfe6-711bb14a22cf","Type":"ContainerStarted","Data":"a40e0927e9162ddf55dc5162baefbff8f2d474216f237ede8bc3e4ff8fcf16ea"} Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.749297 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.751019 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" event={"ID":"5b270fa1-96a9-4edd-b984-34900c29cc1b","Type":"ContainerStarted","Data":"b07fc07a40b1e37634a335274f0dabc3ab49c3166cb2ef2570154b425a416fc1"} Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.751328 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.753245 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" event={"ID":"19a022dc-93c1-4899-837f-4fd78793d13d","Type":"ContainerStarted","Data":"94e5b8accd248733fe2499811080590fab0c2e50eb257c205f6b350fb63dc99b"} Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.753439 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.763166 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" podStartSLOduration=25.035146801 podStartE2EDuration="38.763149822s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:32.27263913 +0000 UTC m=+1066.946975845" lastFinishedPulling="2026-03-20 16:18:46.000642121 +0000 UTC m=+1080.674978866" observedRunningTime="2026-03-20 16:18:46.759152354 +0000 UTC m=+1081.433489069" watchObservedRunningTime="2026-03-20 16:18:46.763149822 +0000 UTC m=+1081.437486537" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.782527 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" podStartSLOduration=3.174359629 podStartE2EDuration="38.78251037s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.402815086 +0000 UTC m=+1045.077151801" lastFinishedPulling="2026-03-20 16:18:46.010965827 +0000 UTC m=+1080.685302542" observedRunningTime="2026-03-20 16:18:46.778643026 +0000 UTC m=+1081.452979741" watchObservedRunningTime="2026-03-20 16:18:46.78251037 +0000 UTC m=+1081.456847085" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.798951 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" 
podStartSLOduration=2.869173406 podStartE2EDuration="38.79893135s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.079929669 +0000 UTC m=+1044.754266384" lastFinishedPulling="2026-03-20 16:18:46.009687613 +0000 UTC m=+1080.684024328" observedRunningTime="2026-03-20 16:18:46.796380972 +0000 UTC m=+1081.470717687" watchObservedRunningTime="2026-03-20 16:18:46.79893135 +0000 UTC m=+1081.473268065" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.816890 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" podStartSLOduration=3.045976652 podStartE2EDuration="38.816868701s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.239657107 +0000 UTC m=+1044.913993832" lastFinishedPulling="2026-03-20 16:18:46.010549166 +0000 UTC m=+1080.684885881" observedRunningTime="2026-03-20 16:18:46.816153271 +0000 UTC m=+1081.490489986" watchObservedRunningTime="2026-03-20 16:18:46.816868701 +0000 UTC m=+1081.491205416" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.840351 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" podStartSLOduration=2.397635878 podStartE2EDuration="38.840334039s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:09.568374179 +0000 UTC m=+1044.242710894" lastFinishedPulling="2026-03-20 16:18:46.01107233 +0000 UTC m=+1080.685409055" observedRunningTime="2026-03-20 16:18:46.838692035 +0000 UTC m=+1081.513028750" watchObservedRunningTime="2026-03-20 16:18:46.840334039 +0000 UTC m=+1081.514670754" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.861150 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" 
podStartSLOduration=2.825363604 podStartE2EDuration="38.861133346s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:09.877000165 +0000 UTC m=+1044.551336880" lastFinishedPulling="2026-03-20 16:18:45.912769907 +0000 UTC m=+1080.587106622" observedRunningTime="2026-03-20 16:18:46.855616018 +0000 UTC m=+1081.529952723" watchObservedRunningTime="2026-03-20 16:18:46.861133346 +0000 UTC m=+1081.535470061" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.886986 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" podStartSLOduration=2.477944508 podStartE2EDuration="38.886964348s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:09.57964426 +0000 UTC m=+1044.253980975" lastFinishedPulling="2026-03-20 16:18:45.98866411 +0000 UTC m=+1080.663000815" observedRunningTime="2026-03-20 16:18:46.882472378 +0000 UTC m=+1081.556809113" watchObservedRunningTime="2026-03-20 16:18:46.886964348 +0000 UTC m=+1081.561301103" Mar 20 16:18:46 crc kubenswrapper[4708]: I0320 16:18:46.919297 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" podStartSLOduration=2.990720241 podStartE2EDuration="38.919275903s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" firstStartedPulling="2026-03-20 16:18:10.082324343 +0000 UTC m=+1044.756661058" lastFinishedPulling="2026-03-20 16:18:46.010880015 +0000 UTC m=+1080.685216720" observedRunningTime="2026-03-20 16:18:46.913420847 +0000 UTC m=+1081.587757562" watchObservedRunningTime="2026-03-20 16:18:46.919275903 +0000 UTC m=+1081.593612618" Mar 20 16:18:51 crc kubenswrapper[4708]: I0320 16:18:51.136308 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5b5b55fc46-x92n5" Mar 20 16:18:54 
crc kubenswrapper[4708]: I0320 16:18:54.793149 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-8d4c8954d-tqkbf" Mar 20 16:18:58 crc kubenswrapper[4708]: I0320 16:18:58.516288 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-xr46h" Mar 20 16:18:58 crc kubenswrapper[4708]: I0320 16:18:58.537977 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sl77r" Mar 20 16:18:58 crc kubenswrapper[4708]: I0320 16:18:58.623018 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fznb9" Mar 20 16:18:58 crc kubenswrapper[4708]: I0320 16:18:58.810999 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zxbx5" Mar 20 16:18:58 crc kubenswrapper[4708]: I0320 16:18:58.855376 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" event={"ID":"26cc5de3-9cef-462a-9e28-ac485ed04178","Type":"ContainerStarted","Data":"0c2c710a6f53bb6f662d34fb3c69aa231ef7dfd8e06bbdd7691a83b621b84235"} Mar 20 16:18:58 crc kubenswrapper[4708]: I0320 16:18:58.855788 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:18:58 crc kubenswrapper[4708]: I0320 16:18:58.888271 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" podStartSLOduration=25.480252102 podStartE2EDuration="50.888250672s" podCreationTimestamp="2026-03-20 16:18:08 +0000 UTC" 
firstStartedPulling="2026-03-20 16:18:32.273943155 +0000 UTC m=+1066.948279870" lastFinishedPulling="2026-03-20 16:18:57.681941705 +0000 UTC m=+1092.356278440" observedRunningTime="2026-03-20 16:18:58.881647386 +0000 UTC m=+1093.555984111" watchObservedRunningTime="2026-03-20 16:18:58.888250672 +0000 UTC m=+1093.562587397" Mar 20 16:18:58 crc kubenswrapper[4708]: I0320 16:18:58.918065 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-rxb8r" Mar 20 16:18:59 crc kubenswrapper[4708]: I0320 16:18:59.135117 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8nqm" Mar 20 16:18:59 crc kubenswrapper[4708]: I0320 16:18:59.335302 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pcdrn" Mar 20 16:19:04 crc kubenswrapper[4708]: I0320 16:19:04.874734 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5dzcth" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.450329 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pnrgn"] Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.454066 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.457232 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.457328 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.457249 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.460104 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-v75xz" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.489825 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pnrgn"] Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.527057 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vc8cm"] Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.528232 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.534809 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.548254 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vc8cm"] Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.559537 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-config\") pod \"dnsmasq-dns-78dd6ddcc-vc8cm\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.559588 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfkr9\" (UniqueName: \"kubernetes.io/projected/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-kube-api-access-rfkr9\") pod \"dnsmasq-dns-78dd6ddcc-vc8cm\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.559610 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vc8cm\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.559636 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-config\") pod \"dnsmasq-dns-675f4bcbfc-pnrgn\" (UID: \"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" Mar 20 16:19:24 
crc kubenswrapper[4708]: I0320 16:19:24.559724 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv6p5\" (UniqueName: \"kubernetes.io/projected/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-kube-api-access-hv6p5\") pod \"dnsmasq-dns-675f4bcbfc-pnrgn\" (UID: \"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.661279 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv6p5\" (UniqueName: \"kubernetes.io/projected/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-kube-api-access-hv6p5\") pod \"dnsmasq-dns-675f4bcbfc-pnrgn\" (UID: \"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.661402 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-config\") pod \"dnsmasq-dns-78dd6ddcc-vc8cm\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.661441 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfkr9\" (UniqueName: \"kubernetes.io/projected/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-kube-api-access-rfkr9\") pod \"dnsmasq-dns-78dd6ddcc-vc8cm\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.661467 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vc8cm\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:24 crc kubenswrapper[4708]: 
I0320 16:19:24.661498 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-config\") pod \"dnsmasq-dns-675f4bcbfc-pnrgn\" (UID: \"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.662480 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-config\") pod \"dnsmasq-dns-675f4bcbfc-pnrgn\" (UID: \"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.662637 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-config\") pod \"dnsmasq-dns-78dd6ddcc-vc8cm\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.662698 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vc8cm\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.679695 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv6p5\" (UniqueName: \"kubernetes.io/projected/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-kube-api-access-hv6p5\") pod \"dnsmasq-dns-675f4bcbfc-pnrgn\" (UID: \"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.680304 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfkr9\" (UniqueName: 
\"kubernetes.io/projected/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-kube-api-access-rfkr9\") pod \"dnsmasq-dns-78dd6ddcc-vc8cm\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.781084 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" Mar 20 16:19:24 crc kubenswrapper[4708]: I0320 16:19:24.845681 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:25 crc kubenswrapper[4708]: I0320 16:19:25.285995 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pnrgn"] Mar 20 16:19:25 crc kubenswrapper[4708]: W0320 16:19:25.289804 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20ea7cac_10d2_47f5_8ecd_bef3d4ef8bc7.slice/crio-f6c9a012662ac3626e242d8e955b0b58ac705e04276146a214f2d58b24b2f5d1 WatchSource:0}: Error finding container f6c9a012662ac3626e242d8e955b0b58ac705e04276146a214f2d58b24b2f5d1: Status 404 returned error can't find the container with id f6c9a012662ac3626e242d8e955b0b58ac705e04276146a214f2d58b24b2f5d1 Mar 20 16:19:25 crc kubenswrapper[4708]: I0320 16:19:25.358903 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vc8cm"] Mar 20 16:19:26 crc kubenswrapper[4708]: I0320 16:19:26.055404 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" event={"ID":"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7","Type":"ContainerStarted","Data":"f6c9a012662ac3626e242d8e955b0b58ac705e04276146a214f2d58b24b2f5d1"} Mar 20 16:19:26 crc kubenswrapper[4708]: I0320 16:19:26.057568 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" 
event={"ID":"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a","Type":"ContainerStarted","Data":"d45d856fe711974ae46a332d8f4ef84ca80ce59b5f767d0cbf18fa6179b46a74"} Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.250285 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pnrgn"] Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.282306 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-fzv8c"] Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.283850 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.297092 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-fzv8c"] Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.424180 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-config\") pod \"dnsmasq-dns-666b6646f7-fzv8c\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.424608 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p5vd\" (UniqueName: \"kubernetes.io/projected/05ccae0c-99a6-4534-865d-65b3a96a5832-kube-api-access-6p5vd\") pod \"dnsmasq-dns-666b6646f7-fzv8c\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.424748 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-dns-svc\") pod \"dnsmasq-dns-666b6646f7-fzv8c\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " 
pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.526572 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p5vd\" (UniqueName: \"kubernetes.io/projected/05ccae0c-99a6-4534-865d-65b3a96a5832-kube-api-access-6p5vd\") pod \"dnsmasq-dns-666b6646f7-fzv8c\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.526654 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-dns-svc\") pod \"dnsmasq-dns-666b6646f7-fzv8c\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.526769 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-config\") pod \"dnsmasq-dns-666b6646f7-fzv8c\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.528772 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-config\") pod \"dnsmasq-dns-666b6646f7-fzv8c\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.531185 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-dns-svc\") pod \"dnsmasq-dns-666b6646f7-fzv8c\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.565952 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p5vd\" (UniqueName: \"kubernetes.io/projected/05ccae0c-99a6-4534-865d-65b3a96a5832-kube-api-access-6p5vd\") pod \"dnsmasq-dns-666b6646f7-fzv8c\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.626724 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.684954 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vc8cm"] Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.721362 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5j22k"] Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.722931 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.734518 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5j22k"] Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.834481 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-config\") pod \"dnsmasq-dns-57d769cc4f-5j22k\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.834576 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-5j22k\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:19:27 crc 
kubenswrapper[4708]: I0320 16:19:27.834638 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl9mk\" (UniqueName: \"kubernetes.io/projected/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-kube-api-access-xl9mk\") pod \"dnsmasq-dns-57d769cc4f-5j22k\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.936640 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-config\") pod \"dnsmasq-dns-57d769cc4f-5j22k\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.937130 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-5j22k\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.937177 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl9mk\" (UniqueName: \"kubernetes.io/projected/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-kube-api-access-xl9mk\") pod \"dnsmasq-dns-57d769cc4f-5j22k\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.937970 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-5j22k\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.937993 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-config\") pod \"dnsmasq-dns-57d769cc4f-5j22k\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:19:27 crc kubenswrapper[4708]: I0320 16:19:27.960030 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl9mk\" (UniqueName: \"kubernetes.io/projected/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-kube-api-access-xl9mk\") pod \"dnsmasq-dns-57d769cc4f-5j22k\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.060186 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.227938 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.229310 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.231463 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.231637 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.232273 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.232282 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.232368 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.233990 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.234005 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x4phx" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.256435 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 16:19:28 crc kubenswrapper[4708]: W0320 16:19:28.334325 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05ccae0c_99a6_4534_865d_65b3a96a5832.slice/crio-a6c8cbecf0f1265aaee62d641f0c1914f9eb20486d1f02ae92d66a996411f823 WatchSource:0}: Error finding container a6c8cbecf0f1265aaee62d641f0c1914f9eb20486d1f02ae92d66a996411f823: Status 404 returned error can't find the container with id a6c8cbecf0f1265aaee62d641f0c1914f9eb20486d1f02ae92d66a996411f823 Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.343722 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.343789 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1da957bf-1f80-4bef-9033-333fa60118c3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.343809 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.343857 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.343885 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.343918 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1da957bf-1f80-4bef-9033-333fa60118c3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.343959 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.343983 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb87s\" (UniqueName: \"kubernetes.io/projected/1da957bf-1f80-4bef-9033-333fa60118c3-kube-api-access-gb87s\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.344009 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1da957bf-1f80-4bef-9033-333fa60118c3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.344051 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1da957bf-1f80-4bef-9033-333fa60118c3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.344083 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/1da957bf-1f80-4bef-9033-333fa60118c3-config-data\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.352159 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-fzv8c"] Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.445994 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1da957bf-1f80-4bef-9033-333fa60118c3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.446367 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1da957bf-1f80-4bef-9033-333fa60118c3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.446397 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1da957bf-1f80-4bef-9033-333fa60118c3-config-data\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.446451 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.446485 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/1da957bf-1f80-4bef-9033-333fa60118c3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.446505 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.446553 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.446634 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.446662 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1da957bf-1f80-4bef-9033-333fa60118c3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.446697 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " 
pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.446714 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb87s\" (UniqueName: \"kubernetes.io/projected/1da957bf-1f80-4bef-9033-333fa60118c3-kube-api-access-gb87s\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.446959 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.447488 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1da957bf-1f80-4bef-9033-333fa60118c3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.447644 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1da957bf-1f80-4bef-9033-333fa60118c3-config-data\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.447862 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.448106 4708 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.448662 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1da957bf-1f80-4bef-9033-333fa60118c3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.460528 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1da957bf-1f80-4bef-9033-333fa60118c3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.463040 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.463241 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1da957bf-1f80-4bef-9033-333fa60118c3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.464495 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb87s\" (UniqueName: \"kubernetes.io/projected/1da957bf-1f80-4bef-9033-333fa60118c3-kube-api-access-gb87s\") pod \"rabbitmq-server-0\" (UID: 
\"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.464715 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1da957bf-1f80-4bef-9033-333fa60118c3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.468279 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1da957bf-1f80-4bef-9033-333fa60118c3\") " pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.556166 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.615358 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5j22k"] Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.653392 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.661738 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.664949 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cdr48" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.665222 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.666828 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.666987 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.667128 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.667344 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.668179 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.681331 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.752223 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.752289 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/103fe6f4-2ac5-430b-9ce4-2d142b273674-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.752355 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.752394 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.752450 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.752504 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.752534 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/103fe6f4-2ac5-430b-9ce4-2d142b273674-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.752582 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/103fe6f4-2ac5-430b-9ce4-2d142b273674-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.752606 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/103fe6f4-2ac5-430b-9ce4-2d142b273674-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.752653 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/103fe6f4-2ac5-430b-9ce4-2d142b273674-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.752698 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvhsk\" (UniqueName: \"kubernetes.io/projected/103fe6f4-2ac5-430b-9ce4-2d142b273674-kube-api-access-lvhsk\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.856555 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/103fe6f4-2ac5-430b-9ce4-2d142b273674-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.856643 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.856686 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.856704 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.856721 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.856745 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/103fe6f4-2ac5-430b-9ce4-2d142b273674-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.856777 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/103fe6f4-2ac5-430b-9ce4-2d142b273674-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.856792 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/103fe6f4-2ac5-430b-9ce4-2d142b273674-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.856816 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/103fe6f4-2ac5-430b-9ce4-2d142b273674-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.856836 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvhsk\" (UniqueName: \"kubernetes.io/projected/103fe6f4-2ac5-430b-9ce4-2d142b273674-kube-api-access-lvhsk\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.856877 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 
16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.857867 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/103fe6f4-2ac5-430b-9ce4-2d142b273674-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.858213 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.858724 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/103fe6f4-2ac5-430b-9ce4-2d142b273674-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.859131 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.859198 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.859837 4708 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/103fe6f4-2ac5-430b-9ce4-2d142b273674-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.864547 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.864814 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/103fe6f4-2ac5-430b-9ce4-2d142b273674-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.865286 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/103fe6f4-2ac5-430b-9ce4-2d142b273674-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.875837 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvhsk\" (UniqueName: \"kubernetes.io/projected/103fe6f4-2ac5-430b-9ce4-2d142b273674-kube-api-access-lvhsk\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.893810 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/103fe6f4-2ac5-430b-9ce4-2d142b273674-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:28 crc kubenswrapper[4708]: I0320 16:19:28.907190 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"103fe6f4-2ac5-430b-9ce4-2d142b273674\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:29 crc kubenswrapper[4708]: I0320 16:19:29.003072 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:19:29 crc kubenswrapper[4708]: I0320 16:19:29.121558 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" event={"ID":"295d15f2-fb02-4b05-a59a-40f8e5c2a60e","Type":"ContainerStarted","Data":"55b4d76f568aa4969b8ae339ff20c4a6b493ed21617e8cfcc1ca5b4dced349c9"} Mar 20 16:19:29 crc kubenswrapper[4708]: I0320 16:19:29.126091 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" event={"ID":"05ccae0c-99a6-4534-865d-65b3a96a5832","Type":"ContainerStarted","Data":"a6c8cbecf0f1265aaee62d641f0c1914f9eb20486d1f02ae92d66a996411f823"} Mar 20 16:19:29 crc kubenswrapper[4708]: I0320 16:19:29.245914 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 16:19:29 crc kubenswrapper[4708]: I0320 16:19:29.612966 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.103918 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.105989 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.109003 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.109202 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.109340 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.113754 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-mbn8l" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.123289 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.190605 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052c07b4-fc8c-45df-9294-d6217de2f52c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.190700 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.190730 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/052c07b4-fc8c-45df-9294-d6217de2f52c-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.190751 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/052c07b4-fc8c-45df-9294-d6217de2f52c-kolla-config\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.190787 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcs9r\" (UniqueName: \"kubernetes.io/projected/052c07b4-fc8c-45df-9294-d6217de2f52c-kube-api-access-dcs9r\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.190834 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/052c07b4-fc8c-45df-9294-d6217de2f52c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.191141 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/052c07b4-fc8c-45df-9294-d6217de2f52c-config-data-default\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.191206 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/052c07b4-fc8c-45df-9294-d6217de2f52c-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.202818 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.213387 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1da957bf-1f80-4bef-9033-333fa60118c3","Type":"ContainerStarted","Data":"5eefc29fed74c731f1fe441568cbbbcd0af42700816aeac40a10882fb696d17e"} Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.294134 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/052c07b4-fc8c-45df-9294-d6217de2f52c-config-data-default\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.294246 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/052c07b4-fc8c-45df-9294-d6217de2f52c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.294334 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052c07b4-fc8c-45df-9294-d6217de2f52c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.294369 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " 
pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.294423 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/052c07b4-fc8c-45df-9294-d6217de2f52c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.294444 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/052c07b4-fc8c-45df-9294-d6217de2f52c-kolla-config\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.294497 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcs9r\" (UniqueName: \"kubernetes.io/projected/052c07b4-fc8c-45df-9294-d6217de2f52c-kube-api-access-dcs9r\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.294560 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/052c07b4-fc8c-45df-9294-d6217de2f52c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.296252 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.309616 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/052c07b4-fc8c-45df-9294-d6217de2f52c-kolla-config\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.310218 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/052c07b4-fc8c-45df-9294-d6217de2f52c-config-data-default\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.311457 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/052c07b4-fc8c-45df-9294-d6217de2f52c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.313568 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/052c07b4-fc8c-45df-9294-d6217de2f52c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.323261 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/052c07b4-fc8c-45df-9294-d6217de2f52c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.338175 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/052c07b4-fc8c-45df-9294-d6217de2f52c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.339064 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcs9r\" (UniqueName: \"kubernetes.io/projected/052c07b4-fc8c-45df-9294-d6217de2f52c-kube-api-access-dcs9r\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.392544 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"052c07b4-fc8c-45df-9294-d6217de2f52c\") " pod="openstack/openstack-galera-0" Mar 20 16:19:30 crc kubenswrapper[4708]: I0320 16:19:30.443402 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.364053 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.366166 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.370352 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.371141 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.372152 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-hr4vl" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.372316 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.410613 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.436140 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98e85ea9-8f00-458b-9016-ef5c4b9569f7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.436230 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98e85ea9-8f00-458b-9016-ef5c4b9569f7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.436262 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/98e85ea9-8f00-458b-9016-ef5c4b9569f7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.436279 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98e85ea9-8f00-458b-9016-ef5c4b9569f7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.436323 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e85ea9-8f00-458b-9016-ef5c4b9569f7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.436595 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkzns\" (UniqueName: \"kubernetes.io/projected/98e85ea9-8f00-458b-9016-ef5c4b9569f7-kube-api-access-hkzns\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.436824 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98e85ea9-8f00-458b-9016-ef5c4b9569f7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.436879 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.539826 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkzns\" (UniqueName: \"kubernetes.io/projected/98e85ea9-8f00-458b-9016-ef5c4b9569f7-kube-api-access-hkzns\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.539883 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98e85ea9-8f00-458b-9016-ef5c4b9569f7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.539911 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.539944 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98e85ea9-8f00-458b-9016-ef5c4b9569f7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.539983 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98e85ea9-8f00-458b-9016-ef5c4b9569f7-kolla-config\") 
pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.540004 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98e85ea9-8f00-458b-9016-ef5c4b9569f7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.540026 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98e85ea9-8f00-458b-9016-ef5c4b9569f7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.540066 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e85ea9-8f00-458b-9016-ef5c4b9569f7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.540877 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98e85ea9-8f00-458b-9016-ef5c4b9569f7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.541792 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98e85ea9-8f00-458b-9016-ef5c4b9569f7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.542298 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98e85ea9-8f00-458b-9016-ef5c4b9569f7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.542478 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.546131 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98e85ea9-8f00-458b-9016-ef5c4b9569f7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.552505 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e85ea9-8f00-458b-9016-ef5c4b9569f7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.563135 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98e85ea9-8f00-458b-9016-ef5c4b9569f7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc 
kubenswrapper[4708]: I0320 16:19:31.570290 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkzns\" (UniqueName: \"kubernetes.io/projected/98e85ea9-8f00-458b-9016-ef5c4b9569f7-kube-api-access-hkzns\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.600522 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.603130 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.608543 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"98e85ea9-8f00-458b-9016-ef5c4b9569f7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.611101 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-qtpd2" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.611436 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.612291 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.648294 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80d56a8-3037-4d99-afd9-61aeecc4259c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.648483 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d80d56a8-3037-4d99-afd9-61aeecc4259c-kolla-config\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.648624 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlfxw\" (UniqueName: \"kubernetes.io/projected/d80d56a8-3037-4d99-afd9-61aeecc4259c-kube-api-access-tlfxw\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.648646 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d80d56a8-3037-4d99-afd9-61aeecc4259c-config-data\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.648705 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d80d56a8-3037-4d99-afd9-61aeecc4259c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.648879 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.721208 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.756194 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80d56a8-3037-4d99-afd9-61aeecc4259c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.756280 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d80d56a8-3037-4d99-afd9-61aeecc4259c-kolla-config\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.756329 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlfxw\" (UniqueName: \"kubernetes.io/projected/d80d56a8-3037-4d99-afd9-61aeecc4259c-kube-api-access-tlfxw\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.756352 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d80d56a8-3037-4d99-afd9-61aeecc4259c-config-data\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.756384 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d80d56a8-3037-4d99-afd9-61aeecc4259c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.757983 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d80d56a8-3037-4d99-afd9-61aeecc4259c-kolla-config\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.759171 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d80d56a8-3037-4d99-afd9-61aeecc4259c-config-data\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.762078 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d80d56a8-3037-4d99-afd9-61aeecc4259c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.762892 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80d56a8-3037-4d99-afd9-61aeecc4259c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.787054 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlfxw\" (UniqueName: \"kubernetes.io/projected/d80d56a8-3037-4d99-afd9-61aeecc4259c-kube-api-access-tlfxw\") pod \"memcached-0\" (UID: \"d80d56a8-3037-4d99-afd9-61aeecc4259c\") " pod="openstack/memcached-0" Mar 20 16:19:31 crc kubenswrapper[4708]: I0320 16:19:31.985034 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 16:19:33 crc kubenswrapper[4708]: I0320 16:19:33.951925 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:19:33 crc kubenswrapper[4708]: I0320 16:19:33.955626 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:19:33 crc kubenswrapper[4708]: I0320 16:19:33.961259 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lhghw" Mar 20 16:19:33 crc kubenswrapper[4708]: I0320 16:19:33.965121 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:19:34 crc kubenswrapper[4708]: I0320 16:19:34.008101 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxnhb\" (UniqueName: \"kubernetes.io/projected/301fee8b-2c0b-46c6-810c-a89d85b4efb4-kube-api-access-rxnhb\") pod \"kube-state-metrics-0\" (UID: \"301fee8b-2c0b-46c6-810c-a89d85b4efb4\") " pod="openstack/kube-state-metrics-0" Mar 20 16:19:34 crc kubenswrapper[4708]: I0320 16:19:34.109609 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxnhb\" (UniqueName: \"kubernetes.io/projected/301fee8b-2c0b-46c6-810c-a89d85b4efb4-kube-api-access-rxnhb\") pod \"kube-state-metrics-0\" (UID: \"301fee8b-2c0b-46c6-810c-a89d85b4efb4\") " pod="openstack/kube-state-metrics-0" Mar 20 16:19:34 crc kubenswrapper[4708]: I0320 16:19:34.129440 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxnhb\" (UniqueName: \"kubernetes.io/projected/301fee8b-2c0b-46c6-810c-a89d85b4efb4-kube-api-access-rxnhb\") pod \"kube-state-metrics-0\" (UID: \"301fee8b-2c0b-46c6-810c-a89d85b4efb4\") " pod="openstack/kube-state-metrics-0" Mar 20 16:19:34 crc kubenswrapper[4708]: I0320 16:19:34.318239 4708 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:19:35 crc kubenswrapper[4708]: I0320 16:19:35.294236 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"103fe6f4-2ac5-430b-9ce4-2d142b273674","Type":"ContainerStarted","Data":"46750042f84e5275ae83cf573d9db6dcd0ba84ba4f1aa8387419eed7111250b0"} Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.086016 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-22qr2"] Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.088160 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.092079 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-wvrd6" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.092370 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.092496 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.097508 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-22qr2"] Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.119706 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-w47cd"] Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.121244 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.130271 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-w47cd"] Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279534 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-var-run\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279583 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-ovn-controller-tls-certs\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279610 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-combined-ca-bundle\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279688 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-scripts\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279706 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-var-run-ovn\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279724 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-etc-ovs\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279747 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-var-log-ovn\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279770 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-var-run\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279807 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-var-lib\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279829 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l2sj\" (UniqueName: 
\"kubernetes.io/projected/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-kube-api-access-5l2sj\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279873 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-var-log\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279895 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkds7\" (UniqueName: \"kubernetes.io/projected/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-kube-api-access-zkds7\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.279909 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-scripts\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.382553 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-var-run\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.382650 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-ovn-controller-tls-certs\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.382702 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-combined-ca-bundle\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.382805 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-scripts\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.382832 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-var-run-ovn\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.382865 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-etc-ovs\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.382907 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-var-log-ovn\") pod \"ovn-controller-22qr2\" (UID: 
\"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.382999 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-var-run\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.383046 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-var-lib\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.383078 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l2sj\" (UniqueName: \"kubernetes.io/projected/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-kube-api-access-5l2sj\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.383166 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-var-log\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.383219 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-var-run\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.383279 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkds7\" (UniqueName: \"kubernetes.io/projected/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-kube-api-access-zkds7\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.383318 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-scripts\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.383359 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-var-log-ovn\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.383597 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-var-run\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.383847 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-var-lib\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.385243 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-scripts\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.385342 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-var-run-ovn\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.385451 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-etc-ovs\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.386072 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-var-log\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.388038 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-scripts\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.392565 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-combined-ca-bundle\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 
20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.401640 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-ovn-controller-tls-certs\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.424422 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.425644 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.426393 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkds7\" (UniqueName: \"kubernetes.io/projected/c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0-kube-api-access-zkds7\") pod \"ovn-controller-ovs-w47cd\" (UID: \"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0\") " pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.432838 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.433033 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-lmdw6" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.433160 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.433337 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.439840 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 
16:19:37.440510 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l2sj\" (UniqueName: \"kubernetes.io/projected/8ad8d5cb-c681-406d-8dee-25f0a0f71b83-kube-api-access-5l2sj\") pod \"ovn-controller-22qr2\" (UID: \"8ad8d5cb-c681-406d-8dee-25f0a0f71b83\") " pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.447614 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.449109 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.586398 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9db73214-be2f-4e2e-b703-8cc42aa0c86a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.586451 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9db73214-be2f-4e2e-b703-8cc42aa0c86a-config\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.586470 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db73214-be2f-4e2e-b703-8cc42aa0c86a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.586493 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9db73214-be2f-4e2e-b703-8cc42aa0c86a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.586528 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcq8q\" (UniqueName: \"kubernetes.io/projected/9db73214-be2f-4e2e-b703-8cc42aa0c86a-kube-api-access-mcq8q\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.586565 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.586608 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9db73214-be2f-4e2e-b703-8cc42aa0c86a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.586631 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9db73214-be2f-4e2e-b703-8cc42aa0c86a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.687848 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcq8q\" (UniqueName: 
\"kubernetes.io/projected/9db73214-be2f-4e2e-b703-8cc42aa0c86a-kube-api-access-mcq8q\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.689185 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.689267 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9db73214-be2f-4e2e-b703-8cc42aa0c86a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.689297 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9db73214-be2f-4e2e-b703-8cc42aa0c86a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.689360 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9db73214-be2f-4e2e-b703-8cc42aa0c86a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.689394 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9db73214-be2f-4e2e-b703-8cc42aa0c86a-config\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc 
kubenswrapper[4708]: I0320 16:19:37.689409 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db73214-be2f-4e2e-b703-8cc42aa0c86a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.689431 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9db73214-be2f-4e2e-b703-8cc42aa0c86a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.691203 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9db73214-be2f-4e2e-b703-8cc42aa0c86a-config\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.692103 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9db73214-be2f-4e2e-b703-8cc42aa0c86a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.692531 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.694108 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9db73214-be2f-4e2e-b703-8cc42aa0c86a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.694422 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9db73214-be2f-4e2e-b703-8cc42aa0c86a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.696963 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9db73214-be2f-4e2e-b703-8cc42aa0c86a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.711793 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db73214-be2f-4e2e-b703-8cc42aa0c86a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.713664 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-22qr2" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.720237 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.725897 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcq8q\" (UniqueName: \"kubernetes.io/projected/9db73214-be2f-4e2e-b703-8cc42aa0c86a-kube-api-access-mcq8q\") pod \"ovsdbserver-nb-0\" (UID: \"9db73214-be2f-4e2e-b703-8cc42aa0c86a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:37 crc kubenswrapper[4708]: I0320 16:19:37.824036 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.447864 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.450155 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.453351 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.453613 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.458663 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-m2wp9" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.459078 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.465460 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.575402 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-config\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.575454 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmlbm\" (UniqueName: \"kubernetes.io/projected/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-kube-api-access-dmlbm\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.575493 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.575513 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.575578 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.575624 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.575693 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.575726 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 
20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.677282 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.677372 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-config\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.677392 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmlbm\" (UniqueName: \"kubernetes.io/projected/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-kube-api-access-dmlbm\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.677491 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.677593 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.677785 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.677874 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.677914 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.677997 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.678338 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.678792 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-config\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 
16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.678804 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.690681 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.690968 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.691259 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.700916 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmlbm\" (UniqueName: \"kubernetes.io/projected/bdb983a7-d139-4a8b-bbb5-6e65999c6be5-kube-api-access-dmlbm\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.705543 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bdb983a7-d139-4a8b-bbb5-6e65999c6be5\") " pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:41 crc kubenswrapper[4708]: I0320 16:19:41.784338 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.573270 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.573953 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xl9mk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-5j22k_openstack(295d15f2-fb02-4b05-a59a-40f8e5c2a60e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.575066 4708 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" podUID="295d15f2-fb02-4b05-a59a-40f8e5c2a60e" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.626986 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.627410 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6p5vd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-fzv8c_openstack(05ccae0c-99a6-4534-865d-65b3a96a5832): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.629234 4708 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" podUID="05ccae0c-99a6-4534-865d-65b3a96a5832" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.661535 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.661717 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rfkr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-vc8cm_openstack(9d3c07e9-bc34-44ff-8eee-2777b4cebc9a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.662935 4708 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" podUID="9d3c07e9-bc34-44ff-8eee-2777b4cebc9a" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.680408 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.680579 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hv6p5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-pnrgn_openstack(20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:19:48 crc kubenswrapper[4708]: E0320 16:19:48.681832 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" podUID="20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7" Mar 20 16:19:49 crc kubenswrapper[4708]: E0320 16:19:49.427712 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" podUID="295d15f2-fb02-4b05-a59a-40f8e5c2a60e" Mar 20 16:19:49 crc kubenswrapper[4708]: E0320 16:19:49.428909 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" podUID="05ccae0c-99a6-4534-865d-65b3a96a5832" Mar 20 16:19:50 crc kubenswrapper[4708]: I0320 16:19:50.969871 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" Mar 20 16:19:50 crc kubenswrapper[4708]: I0320 16:19:50.974991 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.052157 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfkr9\" (UniqueName: \"kubernetes.io/projected/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-kube-api-access-rfkr9\") pod \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.052555 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-config\") pod \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.052596 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv6p5\" (UniqueName: \"kubernetes.io/projected/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-kube-api-access-hv6p5\") pod \"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7\" (UID: \"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7\") " Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.052643 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-dns-svc\") pod \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\" (UID: \"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a\") " Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.052719 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-config\") pod \"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7\" (UID: \"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7\") " Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.053748 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-config" (OuterVolumeSpecName: "config") pod "20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7" (UID: "20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.054171 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d3c07e9-bc34-44ff-8eee-2777b4cebc9a" (UID: "9d3c07e9-bc34-44ff-8eee-2777b4cebc9a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.054555 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-config" (OuterVolumeSpecName: "config") pod "9d3c07e9-bc34-44ff-8eee-2777b4cebc9a" (UID: "9d3c07e9-bc34-44ff-8eee-2777b4cebc9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.081068 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-kube-api-access-rfkr9" (OuterVolumeSpecName: "kube-api-access-rfkr9") pod "9d3c07e9-bc34-44ff-8eee-2777b4cebc9a" (UID: "9d3c07e9-bc34-44ff-8eee-2777b4cebc9a"). InnerVolumeSpecName "kube-api-access-rfkr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.097261 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-kube-api-access-hv6p5" (OuterVolumeSpecName: "kube-api-access-hv6p5") pod "20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7" (UID: "20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7"). InnerVolumeSpecName "kube-api-access-hv6p5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.154683 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.154724 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.154737 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfkr9\" (UniqueName: \"kubernetes.io/projected/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-kube-api-access-rfkr9\") on node \"crc\" DevicePath \"\"" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.154753 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.154765 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv6p5\" (UniqueName: \"kubernetes.io/projected/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7-kube-api-access-hv6p5\") on node \"crc\" DevicePath \"\"" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.434460 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.439429 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.439481 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-pnrgn" event={"ID":"20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7","Type":"ContainerDied","Data":"f6c9a012662ac3626e242d8e955b0b58ac705e04276146a214f2d58b24b2f5d1"} Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.441507 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" event={"ID":"9d3c07e9-bc34-44ff-8eee-2777b4cebc9a","Type":"ContainerDied","Data":"d45d856fe711974ae46a332d8f4ef84ca80ce59b5f767d0cbf18fa6179b46a74"} Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.441566 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vc8cm" Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.511855 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vc8cm"] Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.529794 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vc8cm"] Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.561582 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pnrgn"] Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.576211 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pnrgn"] Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.581490 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.586751 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 16:19:51 crc kubenswrapper[4708]: I0320 16:19:51.591539 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 
16:19:51 crc kubenswrapper[4708]: W0320 16:19:51.904271 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod301fee8b_2c0b_46c6_810c_a89d85b4efb4.slice/crio-643fc60b44a7b26cfda80e35eb9dddcf0292fa1fbb852a49eaeac6289e248613 WatchSource:0}: Error finding container 643fc60b44a7b26cfda80e35eb9dddcf0292fa1fbb852a49eaeac6289e248613: Status 404 returned error can't find the container with id 643fc60b44a7b26cfda80e35eb9dddcf0292fa1fbb852a49eaeac6289e248613 Mar 20 16:19:51 crc kubenswrapper[4708]: W0320 16:19:51.906013 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd80d56a8_3037_4d99_afd9_61aeecc4259c.slice/crio-ae43f8780128a2b462469721b45161d99656b23a045498faa7205bf77fad40f1 WatchSource:0}: Error finding container ae43f8780128a2b462469721b45161d99656b23a045498faa7205bf77fad40f1: Status 404 returned error can't find the container with id ae43f8780128a2b462469721b45161d99656b23a045498faa7205bf77fad40f1 Mar 20 16:19:51 crc kubenswrapper[4708]: W0320 16:19:51.911953 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod052c07b4_fc8c_45df_9294_d6217de2f52c.slice/crio-ebe14acd8be2be866ff96c67d16a4edc2f251bbbf60ac1c45480729ab0d49d79 WatchSource:0}: Error finding container ebe14acd8be2be866ff96c67d16a4edc2f251bbbf60ac1c45480729ab0d49d79: Status 404 returned error can't find the container with id ebe14acd8be2be866ff96c67d16a4edc2f251bbbf60ac1c45480729ab0d49d79 Mar 20 16:19:51 crc kubenswrapper[4708]: W0320 16:19:51.914903 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e85ea9_8f00_458b_9016_ef5c4b9569f7.slice/crio-11d08bb6c4b8d113afc9509cee6b14d82bb983a1858fc8af835289bb9207d11a WatchSource:0}: Error finding container 
11d08bb6c4b8d113afc9509cee6b14d82bb983a1858fc8af835289bb9207d11a: Status 404 returned error can't find the container with id 11d08bb6c4b8d113afc9509cee6b14d82bb983a1858fc8af835289bb9207d11a Mar 20 16:19:52 crc kubenswrapper[4708]: I0320 16:19:52.121752 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7" path="/var/lib/kubelet/pods/20ea7cac-10d2-47f5-8ecd-bef3d4ef8bc7/volumes" Mar 20 16:19:52 crc kubenswrapper[4708]: I0320 16:19:52.122199 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d3c07e9-bc34-44ff-8eee-2777b4cebc9a" path="/var/lib/kubelet/pods/9d3c07e9-bc34-44ff-8eee-2777b4cebc9a/volumes" Mar 20 16:19:52 crc kubenswrapper[4708]: I0320 16:19:52.416696 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 16:19:52 crc kubenswrapper[4708]: I0320 16:19:52.448384 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-22qr2"] Mar 20 16:19:52 crc kubenswrapper[4708]: I0320 16:19:52.452554 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d80d56a8-3037-4d99-afd9-61aeecc4259c","Type":"ContainerStarted","Data":"ae43f8780128a2b462469721b45161d99656b23a045498faa7205bf77fad40f1"} Mar 20 16:19:52 crc kubenswrapper[4708]: I0320 16:19:52.453388 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"301fee8b-2c0b-46c6-810c-a89d85b4efb4","Type":"ContainerStarted","Data":"643fc60b44a7b26cfda80e35eb9dddcf0292fa1fbb852a49eaeac6289e248613"} Mar 20 16:19:52 crc kubenswrapper[4708]: I0320 16:19:52.454258 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98e85ea9-8f00-458b-9016-ef5c4b9569f7","Type":"ContainerStarted","Data":"11d08bb6c4b8d113afc9509cee6b14d82bb983a1858fc8af835289bb9207d11a"} Mar 20 16:19:52 crc kubenswrapper[4708]: I0320 16:19:52.456959 4708 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"052c07b4-fc8c-45df-9294-d6217de2f52c","Type":"ContainerStarted","Data":"ebe14acd8be2be866ff96c67d16a4edc2f251bbbf60ac1c45480729ab0d49d79"} Mar 20 16:19:52 crc kubenswrapper[4708]: I0320 16:19:52.515808 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-w47cd"] Mar 20 16:19:52 crc kubenswrapper[4708]: I0320 16:19:52.602392 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 16:19:53 crc kubenswrapper[4708]: I0320 16:19:53.468422 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-22qr2" event={"ID":"8ad8d5cb-c681-406d-8dee-25f0a0f71b83","Type":"ContainerStarted","Data":"74186d2a8aae5bd9458a1acefcd82422224cd75c0cceb45900dfd1edee24ac45"} Mar 20 16:19:53 crc kubenswrapper[4708]: I0320 16:19:53.471962 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w47cd" event={"ID":"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0","Type":"ContainerStarted","Data":"4c40757cddf8516b68ee4b5201626adba3744faf3f27ab5283076208ae3cbef2"} Mar 20 16:19:55 crc kubenswrapper[4708]: W0320 16:19:55.295304 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9db73214_be2f_4e2e_b703_8cc42aa0c86a.slice/crio-eb9987306802396beee6122cb72c7ce2c604d7cc7dfa60dae7628782d3b9d650 WatchSource:0}: Error finding container eb9987306802396beee6122cb72c7ce2c604d7cc7dfa60dae7628782d3b9d650: Status 404 returned error can't find the container with id eb9987306802396beee6122cb72c7ce2c604d7cc7dfa60dae7628782d3b9d650 Mar 20 16:19:55 crc kubenswrapper[4708]: I0320 16:19:55.487996 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bdb983a7-d139-4a8b-bbb5-6e65999c6be5","Type":"ContainerStarted","Data":"e2b9a7c18345743264c7fc6224385866c3c35334746c2e4ec2fe818ba2bfa243"} Mar 
20 16:19:55 crc kubenswrapper[4708]: I0320 16:19:55.489145 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9db73214-be2f-4e2e-b703-8cc42aa0c86a","Type":"ContainerStarted","Data":"eb9987306802396beee6122cb72c7ce2c604d7cc7dfa60dae7628782d3b9d650"} Mar 20 16:19:56 crc kubenswrapper[4708]: I0320 16:19:56.179149 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:19:56 crc kubenswrapper[4708]: I0320 16:19:56.179712 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:20:00 crc kubenswrapper[4708]: I0320 16:20:00.133572 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567060-smkpv"] Mar 20 16:20:00 crc kubenswrapper[4708]: I0320 16:20:00.136438 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-smkpv" Mar 20 16:20:00 crc kubenswrapper[4708]: I0320 16:20:00.139902 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5" Mar 20 16:20:00 crc kubenswrapper[4708]: I0320 16:20:00.139981 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:20:00 crc kubenswrapper[4708]: I0320 16:20:00.140281 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:20:00 crc kubenswrapper[4708]: I0320 16:20:00.163764 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-smkpv"] Mar 20 16:20:00 crc kubenswrapper[4708]: I0320 16:20:00.240932 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrjp6\" (UniqueName: \"kubernetes.io/projected/6e68e520-8128-47bf-9e19-20ee31ecbcad-kube-api-access-zrjp6\") pod \"auto-csr-approver-29567060-smkpv\" (UID: \"6e68e520-8128-47bf-9e19-20ee31ecbcad\") " pod="openshift-infra/auto-csr-approver-29567060-smkpv" Mar 20 16:20:00 crc kubenswrapper[4708]: I0320 16:20:00.343695 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrjp6\" (UniqueName: \"kubernetes.io/projected/6e68e520-8128-47bf-9e19-20ee31ecbcad-kube-api-access-zrjp6\") pod \"auto-csr-approver-29567060-smkpv\" (UID: \"6e68e520-8128-47bf-9e19-20ee31ecbcad\") " pod="openshift-infra/auto-csr-approver-29567060-smkpv" Mar 20 16:20:00 crc kubenswrapper[4708]: I0320 16:20:00.365868 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrjp6\" (UniqueName: \"kubernetes.io/projected/6e68e520-8128-47bf-9e19-20ee31ecbcad-kube-api-access-zrjp6\") pod \"auto-csr-approver-29567060-smkpv\" (UID: \"6e68e520-8128-47bf-9e19-20ee31ecbcad\") " 
pod="openshift-infra/auto-csr-approver-29567060-smkpv" Mar 20 16:20:00 crc kubenswrapper[4708]: I0320 16:20:00.461825 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-smkpv" Mar 20 16:20:01 crc kubenswrapper[4708]: I0320 16:20:01.120469 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-smkpv"] Mar 20 16:20:01 crc kubenswrapper[4708]: W0320 16:20:01.123860 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e68e520_8128_47bf_9e19_20ee31ecbcad.slice/crio-3f49d947ddeb193bce5fd8a86c5690b62c63cbd0f1ae949e8e31ebbba39a2c91 WatchSource:0}: Error finding container 3f49d947ddeb193bce5fd8a86c5690b62c63cbd0f1ae949e8e31ebbba39a2c91: Status 404 returned error can't find the container with id 3f49d947ddeb193bce5fd8a86c5690b62c63cbd0f1ae949e8e31ebbba39a2c91 Mar 20 16:20:01 crc kubenswrapper[4708]: I0320 16:20:01.539623 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98e85ea9-8f00-458b-9016-ef5c4b9569f7","Type":"ContainerStarted","Data":"b7a66b7d91d3a181ed06cd62bbd4a0ce3c993f0e283c4020a0fd213c4db4e09f"} Mar 20 16:20:01 crc kubenswrapper[4708]: I0320 16:20:01.541236 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-smkpv" event={"ID":"6e68e520-8128-47bf-9e19-20ee31ecbcad","Type":"ContainerStarted","Data":"3f49d947ddeb193bce5fd8a86c5690b62c63cbd0f1ae949e8e31ebbba39a2c91"} Mar 20 16:20:01 crc kubenswrapper[4708]: I0320 16:20:01.542966 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1da957bf-1f80-4bef-9033-333fa60118c3","Type":"ContainerStarted","Data":"c301f7b2aab06d90d1bd62af27049a7e6e3f39f196ab300d7951a8d4dc7d3364"} Mar 20 16:20:02 crc kubenswrapper[4708]: I0320 16:20:02.551105 4708 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"103fe6f4-2ac5-430b-9ce4-2d142b273674","Type":"ContainerStarted","Data":"5d4a7dd1f5603e92bc8e2924bc34be2c14bea6aeb541bbec562edee75673265d"} Mar 20 16:20:04 crc kubenswrapper[4708]: I0320 16:20:04.113455 4708 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:20:13 crc kubenswrapper[4708]: E0320 16:20:13.419243 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Mar 20 16:20:13 crc kubenswrapper[4708]: E0320 16:20:13.420030 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n5d8h66ch669hbbhbch574h76h65fh558hfch5d5h66fh87h659h5b5hbdh59dh66bhf9h546h66dhdhfdhfbh597hd8h5d8h679h65fh574h56dh5b7q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,Recursiv
eReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tlfxw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(d80d56a8-3037-4d99-afd9-61aeecc4259c): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Mar 20 16:20:13 crc kubenswrapper[4708]: E0320 16:20:13.421158 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="d80d56a8-3037-4d99-afd9-61aeecc4259c" Mar 20 16:20:13 crc kubenswrapper[4708]: E0320 16:20:13.646073 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="d80d56a8-3037-4d99-afd9-61aeecc4259c" Mar 20 16:20:14 crc kubenswrapper[4708]: E0320 16:20:14.137534 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Mar 20 16:20:14 crc kubenswrapper[4708]: E0320 16:20:14.137737 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbch4h64ch64ch54bh689hbhc7h55fh699hc7h678h585h8fh5f8h55ch544hb6h58h78hd8h578h665h64dh695h68bh559hf5h8dh5bbh6bhc7q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5l2sj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,
},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-22qr2_openstack(8ad8d5cb-c681-406d-8dee-25f0a0f71b83): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:20:14 crc kubenswrapper[4708]: E0320 16:20:14.138941 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-22qr2" podUID="8ad8d5cb-c681-406d-8dee-25f0a0f71b83" Mar 20 16:20:14 crc kubenswrapper[4708]: I0320 16:20:14.656008 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-smkpv" 
event={"ID":"6e68e520-8128-47bf-9e19-20ee31ecbcad","Type":"ContainerStarted","Data":"3bce7486a1dc6a240390efd4a52deb80f3738f57f67d941bea0386d021a24955"} Mar 20 16:20:14 crc kubenswrapper[4708]: E0320 16:20:14.659550 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-22qr2" podUID="8ad8d5cb-c681-406d-8dee-25f0a0f71b83" Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.664121 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"052c07b4-fc8c-45df-9294-d6217de2f52c","Type":"ContainerStarted","Data":"319ae184d92377004755ada8436cd76b5189780caafc8b5e221419c6cfbea0fe"} Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.666259 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"301fee8b-2c0b-46c6-810c-a89d85b4efb4","Type":"ContainerStarted","Data":"2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06"} Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.666584 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.667614 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bdb983a7-d139-4a8b-bbb5-6e65999c6be5","Type":"ContainerStarted","Data":"570e852c4dea1d58d96f53ca9edfaefabd44e72e7170580e80dea29d6a3fd30d"} Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.668876 4708 generic.go:334] "Generic (PLEG): container finished" podID="295d15f2-fb02-4b05-a59a-40f8e5c2a60e" containerID="4ca398ac1057da9b46e29319821da2a3c8de3ca6ac3b370244be218bb63f3a8a" exitCode=0 Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.668927 4708 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" event={"ID":"295d15f2-fb02-4b05-a59a-40f8e5c2a60e","Type":"ContainerDied","Data":"4ca398ac1057da9b46e29319821da2a3c8de3ca6ac3b370244be218bb63f3a8a"} Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.670954 4708 generic.go:334] "Generic (PLEG): container finished" podID="6e68e520-8128-47bf-9e19-20ee31ecbcad" containerID="3bce7486a1dc6a240390efd4a52deb80f3738f57f67d941bea0386d021a24955" exitCode=0 Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.671048 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-smkpv" event={"ID":"6e68e520-8128-47bf-9e19-20ee31ecbcad","Type":"ContainerDied","Data":"3bce7486a1dc6a240390efd4a52deb80f3738f57f67d941bea0386d021a24955"} Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.672319 4708 generic.go:334] "Generic (PLEG): container finished" podID="05ccae0c-99a6-4534-865d-65b3a96a5832" containerID="8d85bed1ed3d705769826947b71fd6aa828ccdf4dc8c64ebed0a5a148898f938" exitCode=0 Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.672361 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" event={"ID":"05ccae0c-99a6-4534-865d-65b3a96a5832","Type":"ContainerDied","Data":"8d85bed1ed3d705769826947b71fd6aa828ccdf4dc8c64ebed0a5a148898f938"} Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.685362 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9db73214-be2f-4e2e-b703-8cc42aa0c86a","Type":"ContainerStarted","Data":"4cb42e891ffb9e4a0520e6fabf8eaec9c62e1a7f092009d0347f37c9fe190be6"} Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.689505 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w47cd" event={"ID":"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0","Type":"ContainerStarted","Data":"112e5af6128c8cfc5e1227cbdbcdfcccb4f232d506ae4ca87d973a16d2631524"} Mar 20 16:20:15 crc kubenswrapper[4708]: 
I0320 16:20:15.704186 4708 generic.go:334] "Generic (PLEG): container finished" podID="98e85ea9-8f00-458b-9016-ef5c4b9569f7" containerID="b7a66b7d91d3a181ed06cd62bbd4a0ce3c993f0e283c4020a0fd213c4db4e09f" exitCode=0 Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.704358 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98e85ea9-8f00-458b-9016-ef5c4b9569f7","Type":"ContainerDied","Data":"b7a66b7d91d3a181ed06cd62bbd4a0ce3c993f0e283c4020a0fd213c4db4e09f"} Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.707447 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567060-smkpv" podStartSLOduration=3.445410089 podStartE2EDuration="15.707418928s" podCreationTimestamp="2026-03-20 16:20:00 +0000 UTC" firstStartedPulling="2026-03-20 16:20:01.125572351 +0000 UTC m=+1155.799909096" lastFinishedPulling="2026-03-20 16:20:13.38758122 +0000 UTC m=+1168.061917935" observedRunningTime="2026-03-20 16:20:14.711029231 +0000 UTC m=+1169.385365946" watchObservedRunningTime="2026-03-20 16:20:15.707418928 +0000 UTC m=+1170.381755653" Mar 20 16:20:15 crc kubenswrapper[4708]: I0320 16:20:15.815825 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=19.62905482 podStartE2EDuration="42.815806446s" podCreationTimestamp="2026-03-20 16:19:33 +0000 UTC" firstStartedPulling="2026-03-20 16:19:51.916714318 +0000 UTC m=+1146.591051023" lastFinishedPulling="2026-03-20 16:20:15.103465934 +0000 UTC m=+1169.777802649" observedRunningTime="2026-03-20 16:20:15.810234023 +0000 UTC m=+1170.484570758" watchObservedRunningTime="2026-03-20 16:20:15.815806446 +0000 UTC m=+1170.490143501" Mar 20 16:20:16 crc kubenswrapper[4708]: I0320 16:20:16.720939 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" 
event={"ID":"295d15f2-fb02-4b05-a59a-40f8e5c2a60e","Type":"ContainerStarted","Data":"2e87173dc3cefc7073ca619cc37c9444102d3511442c5e7b3da9f9ff1fe83ca1"} Mar 20 16:20:16 crc kubenswrapper[4708]: I0320 16:20:16.721543 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:20:16 crc kubenswrapper[4708]: I0320 16:20:16.724658 4708 generic.go:334] "Generic (PLEG): container finished" podID="c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0" containerID="112e5af6128c8cfc5e1227cbdbcdfcccb4f232d506ae4ca87d973a16d2631524" exitCode=0 Mar 20 16:20:16 crc kubenswrapper[4708]: I0320 16:20:16.724713 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w47cd" event={"ID":"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0","Type":"ContainerDied","Data":"112e5af6128c8cfc5e1227cbdbcdfcccb4f232d506ae4ca87d973a16d2631524"} Mar 20 16:20:16 crc kubenswrapper[4708]: I0320 16:20:16.730785 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98e85ea9-8f00-458b-9016-ef5c4b9569f7","Type":"ContainerStarted","Data":"d8ef98fe7e43123a49216c9bfba90dee18fa6b48a99f53a45772bfe08a3aac63"} Mar 20 16:20:16 crc kubenswrapper[4708]: I0320 16:20:16.734031 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" event={"ID":"05ccae0c-99a6-4534-865d-65b3a96a5832","Type":"ContainerStarted","Data":"afcd536290faeb13b8091d95456e7740f41957166d2c6478a70e8ed264273469"} Mar 20 16:20:16 crc kubenswrapper[4708]: I0320 16:20:16.744761 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" podStartSLOduration=3.40909812 podStartE2EDuration="49.744736526s" podCreationTimestamp="2026-03-20 16:19:27 +0000 UTC" firstStartedPulling="2026-03-20 16:19:28.673085834 +0000 UTC m=+1123.347422549" lastFinishedPulling="2026-03-20 16:20:15.00872423 +0000 UTC m=+1169.683060955" 
observedRunningTime="2026-03-20 16:20:16.739601946 +0000 UTC m=+1171.413938671" watchObservedRunningTime="2026-03-20 16:20:16.744736526 +0000 UTC m=+1171.419073241" Mar 20 16:20:16 crc kubenswrapper[4708]: I0320 16:20:16.777892 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=38.005613053 podStartE2EDuration="46.777865503s" podCreationTimestamp="2026-03-20 16:19:30 +0000 UTC" firstStartedPulling="2026-03-20 16:19:51.919823513 +0000 UTC m=+1146.594160228" lastFinishedPulling="2026-03-20 16:20:00.692075963 +0000 UTC m=+1155.366412678" observedRunningTime="2026-03-20 16:20:16.769096653 +0000 UTC m=+1171.443433368" watchObservedRunningTime="2026-03-20 16:20:16.777865503 +0000 UTC m=+1171.452202238" Mar 20 16:20:16 crc kubenswrapper[4708]: I0320 16:20:16.791918 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" podStartSLOduration=3.699920472 podStartE2EDuration="49.791880627s" podCreationTimestamp="2026-03-20 16:19:27 +0000 UTC" firstStartedPulling="2026-03-20 16:19:28.35221469 +0000 UTC m=+1123.026551405" lastFinishedPulling="2026-03-20 16:20:14.444174845 +0000 UTC m=+1169.118511560" observedRunningTime="2026-03-20 16:20:16.787501987 +0000 UTC m=+1171.461838702" watchObservedRunningTime="2026-03-20 16:20:16.791880627 +0000 UTC m=+1171.466217342" Mar 20 16:20:17 crc kubenswrapper[4708]: I0320 16:20:17.054918 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-smkpv" Mar 20 16:20:17 crc kubenswrapper[4708]: I0320 16:20:17.125066 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrjp6\" (UniqueName: \"kubernetes.io/projected/6e68e520-8128-47bf-9e19-20ee31ecbcad-kube-api-access-zrjp6\") pod \"6e68e520-8128-47bf-9e19-20ee31ecbcad\" (UID: \"6e68e520-8128-47bf-9e19-20ee31ecbcad\") " Mar 20 16:20:17 crc kubenswrapper[4708]: I0320 16:20:17.151466 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e68e520-8128-47bf-9e19-20ee31ecbcad-kube-api-access-zrjp6" (OuterVolumeSpecName: "kube-api-access-zrjp6") pod "6e68e520-8128-47bf-9e19-20ee31ecbcad" (UID: "6e68e520-8128-47bf-9e19-20ee31ecbcad"). InnerVolumeSpecName "kube-api-access-zrjp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:17 crc kubenswrapper[4708]: I0320 16:20:17.228296 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrjp6\" (UniqueName: \"kubernetes.io/projected/6e68e520-8128-47bf-9e19-20ee31ecbcad-kube-api-access-zrjp6\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:17 crc kubenswrapper[4708]: I0320 16:20:17.629521 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:20:17 crc kubenswrapper[4708]: I0320 16:20:17.750523 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-smkpv" event={"ID":"6e68e520-8128-47bf-9e19-20ee31ecbcad","Type":"ContainerDied","Data":"3f49d947ddeb193bce5fd8a86c5690b62c63cbd0f1ae949e8e31ebbba39a2c91"} Mar 20 16:20:17 crc kubenswrapper[4708]: I0320 16:20:17.750567 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f49d947ddeb193bce5fd8a86c5690b62c63cbd0f1ae949e8e31ebbba39a2c91" Mar 20 16:20:17 crc kubenswrapper[4708]: I0320 16:20:17.750638 4708 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-smkpv" Mar 20 16:20:17 crc kubenswrapper[4708]: I0320 16:20:17.755369 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w47cd" event={"ID":"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0","Type":"ContainerStarted","Data":"d2d3877fb70a6a74b90ac5b91cfba1ea1745bde5fb9e674e45c43ebabcf78bb0"} Mar 20 16:20:17 crc kubenswrapper[4708]: I0320 16:20:17.777002 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-5hj5k"] Mar 20 16:20:17 crc kubenswrapper[4708]: I0320 16:20:17.784445 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-5hj5k"] Mar 20 16:20:18 crc kubenswrapper[4708]: I0320 16:20:18.121620 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="160992b9-f53d-47e7-9b7a-1436f5f815e2" path="/var/lib/kubelet/pods/160992b9-f53d-47e7-9b7a-1436f5f815e2/volumes" Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.776263 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bdb983a7-d139-4a8b-bbb5-6e65999c6be5","Type":"ContainerStarted","Data":"db9b95bd6676041454d61b112a4e658e1aa3d3bee6569761218a8c5c5e1cdecd"} Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.778445 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9db73214-be2f-4e2e-b703-8cc42aa0c86a","Type":"ContainerStarted","Data":"507f26449cf991728d743e97cbf4d4f78455589d36c716dff35968eb7ac3be12"} Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.780293 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w47cd" event={"ID":"c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0","Type":"ContainerStarted","Data":"5284a3fc7ffed3e8d93cd0a4db89eb6b008df0d7a92b1b38649037633d42e112"} Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.781096 
4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.781137 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.782555 4708 generic.go:334] "Generic (PLEG): container finished" podID="052c07b4-fc8c-45df-9294-d6217de2f52c" containerID="319ae184d92377004755ada8436cd76b5189780caafc8b5e221419c6cfbea0fe" exitCode=0 Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.782598 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"052c07b4-fc8c-45df-9294-d6217de2f52c","Type":"ContainerDied","Data":"319ae184d92377004755ada8436cd76b5189780caafc8b5e221419c6cfbea0fe"} Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.824932 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.834262 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=15.962261091 podStartE2EDuration="39.834246925s" podCreationTimestamp="2026-03-20 16:19:40 +0000 UTC" firstStartedPulling="2026-03-20 16:19:55.303494526 +0000 UTC m=+1149.977831241" lastFinishedPulling="2026-03-20 16:20:19.17548036 +0000 UTC m=+1173.849817075" observedRunningTime="2026-03-20 16:20:19.803305868 +0000 UTC m=+1174.477642593" watchObservedRunningTime="2026-03-20 16:20:19.834246925 +0000 UTC m=+1174.508583640" Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.854418 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-w47cd" podStartSLOduration=22.353997043 podStartE2EDuration="42.854397167s" podCreationTimestamp="2026-03-20 16:19:37 +0000 UTC" firstStartedPulling="2026-03-20 16:19:53.17628517 +0000 UTC 
m=+1147.850621885" lastFinishedPulling="2026-03-20 16:20:13.676685294 +0000 UTC m=+1168.351022009" observedRunningTime="2026-03-20 16:20:19.852074264 +0000 UTC m=+1174.526410979" watchObservedRunningTime="2026-03-20 16:20:19.854397167 +0000 UTC m=+1174.528733872" Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.871166 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:19 crc kubenswrapper[4708]: I0320 16:20:19.884714 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=20.017105801 podStartE2EDuration="43.884646295s" podCreationTimestamp="2026-03-20 16:19:36 +0000 UTC" firstStartedPulling="2026-03-20 16:19:55.304542834 +0000 UTC m=+1149.978879549" lastFinishedPulling="2026-03-20 16:20:19.172083328 +0000 UTC m=+1173.846420043" observedRunningTime="2026-03-20 16:20:19.87937637 +0000 UTC m=+1174.553713085" watchObservedRunningTime="2026-03-20 16:20:19.884646295 +0000 UTC m=+1174.558983010" Mar 20 16:20:20 crc kubenswrapper[4708]: I0320 16:20:20.784864 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:20 crc kubenswrapper[4708]: I0320 16:20:20.795899 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"052c07b4-fc8c-45df-9294-d6217de2f52c","Type":"ContainerStarted","Data":"2107bf1f577d823461080616ddb7be07f0873f5cae1ccc780d00d905ff0c6534"} Mar 20 16:20:20 crc kubenswrapper[4708]: I0320 16:20:20.796976 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:20 crc kubenswrapper[4708]: I0320 16:20:20.826740 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=30.35423585 podStartE2EDuration="51.826718466s" podCreationTimestamp="2026-03-20 16:19:29 +0000 UTC" 
firstStartedPulling="2026-03-20 16:19:51.916506643 +0000 UTC m=+1146.590843358" lastFinishedPulling="2026-03-20 16:20:13.388989259 +0000 UTC m=+1168.063325974" observedRunningTime="2026-03-20 16:20:20.824853944 +0000 UTC m=+1175.499190659" watchObservedRunningTime="2026-03-20 16:20:20.826718466 +0000 UTC m=+1175.501055181" Mar 20 16:20:20 crc kubenswrapper[4708]: I0320 16:20:20.834064 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:20 crc kubenswrapper[4708]: I0320 16:20:20.856156 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.155043 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5j22k"] Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.155323 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" podUID="295d15f2-fb02-4b05-a59a-40f8e5c2a60e" containerName="dnsmasq-dns" containerID="cri-o://2e87173dc3cefc7073ca619cc37c9444102d3511442c5e7b3da9f9ff1fe83ca1" gracePeriod=10 Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.156820 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.203275 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2jct"] Mar 20 16:20:21 crc kubenswrapper[4708]: E0320 16:20:21.203748 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e68e520-8128-47bf-9e19-20ee31ecbcad" containerName="oc" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.203775 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e68e520-8128-47bf-9e19-20ee31ecbcad" containerName="oc" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.203986 4708 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6e68e520-8128-47bf-9e19-20ee31ecbcad" containerName="oc" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.204890 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.207268 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.231613 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2jct"] Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.761368 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.761627 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.764343 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-config\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.764481 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc 
kubenswrapper[4708]: I0320 16:20:21.764801 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5g65\" (UniqueName: \"kubernetes.io/projected/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-kube-api-access-f5g65\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.765003 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.793985 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.847021 4708 generic.go:334] "Generic (PLEG): container finished" podID="295d15f2-fb02-4b05-a59a-40f8e5c2a60e" containerID="2e87173dc3cefc7073ca619cc37c9444102d3511442c5e7b3da9f9ff1fe83ca1" exitCode=0 Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.848157 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" event={"ID":"295d15f2-fb02-4b05-a59a-40f8e5c2a60e","Type":"ContainerDied","Data":"2e87173dc3cefc7073ca619cc37c9444102d3511442c5e7b3da9f9ff1fe83ca1"} Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.873731 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5g65\" (UniqueName: \"kubernetes.io/projected/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-kube-api-access-f5g65\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.873800 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" 
(UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.873949 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-config\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.874024 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.875088 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.876639 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-config\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.884541 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc 
kubenswrapper[4708]: I0320 16:20:21.891464 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-42bhb"] Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.895589 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.898856 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.900452 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.905440 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-42bhb"] Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.940849 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5g65\" (UniqueName: \"kubernetes.io/projected/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-kube-api-access-f5g65\") pod \"dnsmasq-dns-7fd796d7df-b2jct\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.992051 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/459658ac-d380-46ac-9cec-377e902eba9c-ovs-rundir\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.992110 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r95q\" (UniqueName: \"kubernetes.io/projected/459658ac-d380-46ac-9cec-377e902eba9c-kube-api-access-7r95q\") pod \"ovn-controller-metrics-42bhb\" (UID: 
\"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.992216 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/459658ac-d380-46ac-9cec-377e902eba9c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.992286 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/459658ac-d380-46ac-9cec-377e902eba9c-ovn-rundir\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.992327 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459658ac-d380-46ac-9cec-377e902eba9c-combined-ca-bundle\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:21 crc kubenswrapper[4708]: I0320 16:20:21.992472 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/459658ac-d380-46ac-9cec-377e902eba9c-config\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.094630 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/459658ac-d380-46ac-9cec-377e902eba9c-ovn-rundir\") pod 
\"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.095015 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459658ac-d380-46ac-9cec-377e902eba9c-combined-ca-bundle\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.095086 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/459658ac-d380-46ac-9cec-377e902eba9c-config\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.095132 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/459658ac-d380-46ac-9cec-377e902eba9c-ovs-rundir\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.095150 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r95q\" (UniqueName: \"kubernetes.io/projected/459658ac-d380-46ac-9cec-377e902eba9c-kube-api-access-7r95q\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.095182 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/459658ac-d380-46ac-9cec-377e902eba9c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-42bhb\" (UID: 
\"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.095168 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/459658ac-d380-46ac-9cec-377e902eba9c-ovn-rundir\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.095924 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/459658ac-d380-46ac-9cec-377e902eba9c-ovs-rundir\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.096226 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/459658ac-d380-46ac-9cec-377e902eba9c-config\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.103528 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/459658ac-d380-46ac-9cec-377e902eba9c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.114468 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459658ac-d380-46ac-9cec-377e902eba9c-combined-ca-bundle\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 
crc kubenswrapper[4708]: I0320 16:20:22.125230 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r95q\" (UniqueName: \"kubernetes.io/projected/459658ac-d380-46ac-9cec-377e902eba9c-kube-api-access-7r95q\") pod \"ovn-controller-metrics-42bhb\" (UID: \"459658ac-d380-46ac-9cec-377e902eba9c\") " pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.144262 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.226921 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-42bhb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.333959 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.402982 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl9mk\" (UniqueName: \"kubernetes.io/projected/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-kube-api-access-xl9mk\") pod \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.403239 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-config\") pod \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.403279 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-dns-svc\") pod \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\" (UID: \"295d15f2-fb02-4b05-a59a-40f8e5c2a60e\") " Mar 20 16:20:22 
crc kubenswrapper[4708]: I0320 16:20:22.427040 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-kube-api-access-xl9mk" (OuterVolumeSpecName: "kube-api-access-xl9mk") pod "295d15f2-fb02-4b05-a59a-40f8e5c2a60e" (UID: "295d15f2-fb02-4b05-a59a-40f8e5c2a60e"). InnerVolumeSpecName "kube-api-access-xl9mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.450021 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-fzv8c"] Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.450370 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" podUID="05ccae0c-99a6-4534-865d-65b3a96a5832" containerName="dnsmasq-dns" containerID="cri-o://afcd536290faeb13b8091d95456e7740f41957166d2c6478a70e8ed264273469" gracePeriod=10 Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.453585 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.476828 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z5mdd"] Mar 20 16:20:22 crc kubenswrapper[4708]: E0320 16:20:22.477419 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295d15f2-fb02-4b05-a59a-40f8e5c2a60e" containerName="init" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.477437 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="295d15f2-fb02-4b05-a59a-40f8e5c2a60e" containerName="init" Mar 20 16:20:22 crc kubenswrapper[4708]: E0320 16:20:22.477454 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295d15f2-fb02-4b05-a59a-40f8e5c2a60e" containerName="dnsmasq-dns" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.477462 4708 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="295d15f2-fb02-4b05-a59a-40f8e5c2a60e" containerName="dnsmasq-dns" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.477643 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="295d15f2-fb02-4b05-a59a-40f8e5c2a60e" containerName="dnsmasq-dns" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.506735 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.508571 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "295d15f2-fb02-4b05-a59a-40f8e5c2a60e" (UID: "295d15f2-fb02-4b05-a59a-40f8e5c2a60e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.544843 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.544876 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl9mk\" (UniqueName: \"kubernetes.io/projected/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-kube-api-access-xl9mk\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.551810 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-config" (OuterVolumeSpecName: "config") pod "295d15f2-fb02-4b05-a59a-40f8e5c2a60e" (UID: "295d15f2-fb02-4b05-a59a-40f8e5c2a60e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.554853 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.574082 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z5mdd"] Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.629793 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" podUID="05ccae0c-99a6-4534-865d-65b3a96a5832" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.99:5353: connect: connection refused" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.633719 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.640324 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.642457 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.643982 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-k2sdh" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.644054 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.644335 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.644426 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.645959 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/295d15f2-fb02-4b05-a59a-40f8e5c2a60e-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.748107 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cde40d5-06a2-425d-a9c2-a00b10cc3148-scripts\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.748174 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.748205 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cde40d5-06a2-425d-a9c2-a00b10cc3148-config\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.748241 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cde40d5-06a2-425d-a9c2-a00b10cc3148-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.748265 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cde40d5-06a2-425d-a9c2-a00b10cc3148-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " 
pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.748401 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsqck\" (UniqueName: \"kubernetes.io/projected/5cde40d5-06a2-425d-a9c2-a00b10cc3148-kube-api-access-qsqck\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.748454 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.748484 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5cde40d5-06a2-425d-a9c2-a00b10cc3148-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.753473 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz6c2\" (UniqueName: \"kubernetes.io/projected/b37996e7-c038-4444-bbe4-df1f32b2b029-kube-api-access-sz6c2\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.753552 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-config\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 
20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.753585 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.753642 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cde40d5-06a2-425d-a9c2-a00b10cc3148-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.853651 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-42bhb"] Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.861134 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cde40d5-06a2-425d-a9c2-a00b10cc3148-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.861182 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cde40d5-06a2-425d-a9c2-a00b10cc3148-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.861247 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsqck\" (UniqueName: \"kubernetes.io/projected/5cde40d5-06a2-425d-a9c2-a00b10cc3148-kube-api-access-qsqck\") pod \"ovn-northd-0\" (UID: 
\"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.861270 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.861293 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5cde40d5-06a2-425d-a9c2-a00b10cc3148-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.861335 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz6c2\" (UniqueName: \"kubernetes.io/projected/b37996e7-c038-4444-bbe4-df1f32b2b029-kube-api-access-sz6c2\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.861353 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-config\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.861371 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc 
kubenswrapper[4708]: I0320 16:20:22.861394 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cde40d5-06a2-425d-a9c2-a00b10cc3148-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.861418 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cde40d5-06a2-425d-a9c2-a00b10cc3148-scripts\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.861442 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.861459 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cde40d5-06a2-425d-a9c2-a00b10cc3148-config\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.862421 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cde40d5-06a2-425d-a9c2-a00b10cc3148-config\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.872726 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-config\") pod 
\"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.873746 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.874033 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5cde40d5-06a2-425d-a9c2-a00b10cc3148-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.878192 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.883602 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cde40d5-06a2-425d-a9c2-a00b10cc3148-scripts\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.885986 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.918184 
4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cde40d5-06a2-425d-a9c2-a00b10cc3148-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.921931 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cde40d5-06a2-425d-a9c2-a00b10cc3148-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.922990 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cde40d5-06a2-425d-a9c2-a00b10cc3148-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:22 crc kubenswrapper[4708]: I0320 16:20:22.943150 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-42bhb" event={"ID":"459658ac-d380-46ac-9cec-377e902eba9c","Type":"ContainerStarted","Data":"559146bdbddf7f84db19609666ca21a5d0ffc680c162088f931c4d6006943fce"} Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.008408 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2jct"] Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.012288 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" event={"ID":"295d15f2-fb02-4b05-a59a-40f8e5c2a60e","Type":"ContainerDied","Data":"55b4d76f568aa4969b8ae339ff20c4a6b493ed21617e8cfcc1ca5b4dced349c9"} Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.012343 4708 scope.go:117] "RemoveContainer" containerID="2e87173dc3cefc7073ca619cc37c9444102d3511442c5e7b3da9f9ff1fe83ca1" Mar 20 16:20:23 
crc kubenswrapper[4708]: I0320 16:20:23.012554 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-5j22k" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.015168 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz6c2\" (UniqueName: \"kubernetes.io/projected/b37996e7-c038-4444-bbe4-df1f32b2b029-kube-api-access-sz6c2\") pod \"dnsmasq-dns-86db49b7ff-z5mdd\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.017952 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsqck\" (UniqueName: \"kubernetes.io/projected/5cde40d5-06a2-425d-a9c2-a00b10cc3148-kube-api-access-qsqck\") pod \"ovn-northd-0\" (UID: \"5cde40d5-06a2-425d-a9c2-a00b10cc3148\") " pod="openstack/ovn-northd-0" Mar 20 16:20:23 crc kubenswrapper[4708]: W0320 16:20:23.033709 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fb7af17_adad_4cc2_a3f3_95a4f7b9d5c2.slice/crio-aa2abc0d8b017eb791aa086259f6413324b43b154921f8baf5741af497984df5 WatchSource:0}: Error finding container aa2abc0d8b017eb791aa086259f6413324b43b154921f8baf5741af497984df5: Status 404 returned error can't find the container with id aa2abc0d8b017eb791aa086259f6413324b43b154921f8baf5741af497984df5 Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.034987 4708 generic.go:334] "Generic (PLEG): container finished" podID="05ccae0c-99a6-4534-865d-65b3a96a5832" containerID="afcd536290faeb13b8091d95456e7740f41957166d2c6478a70e8ed264273469" exitCode=0 Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.035715 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" 
event={"ID":"05ccae0c-99a6-4534-865d-65b3a96a5832","Type":"ContainerDied","Data":"afcd536290faeb13b8091d95456e7740f41957166d2c6478a70e8ed264273469"} Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.090307 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.110175 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5j22k"] Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.119791 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-5j22k"] Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.120899 4708 scope.go:117] "RemoveContainer" containerID="4ca398ac1057da9b46e29319821da2a3c8de3ca6ac3b370244be218bb63f3a8a" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.197369 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.211525 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.223983 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.268456 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.273335 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-config\") pod \"05ccae0c-99a6-4534-865d-65b3a96a5832\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.273403 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p5vd\" (UniqueName: \"kubernetes.io/projected/05ccae0c-99a6-4534-865d-65b3a96a5832-kube-api-access-6p5vd\") pod \"05ccae0c-99a6-4534-865d-65b3a96a5832\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.273504 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-dns-svc\") pod \"05ccae0c-99a6-4534-865d-65b3a96a5832\" (UID: \"05ccae0c-99a6-4534-865d-65b3a96a5832\") " Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.285298 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05ccae0c-99a6-4534-865d-65b3a96a5832-kube-api-access-6p5vd" (OuterVolumeSpecName: "kube-api-access-6p5vd") pod "05ccae0c-99a6-4534-865d-65b3a96a5832" (UID: "05ccae0c-99a6-4534-865d-65b3a96a5832"). InnerVolumeSpecName "kube-api-access-6p5vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.356641 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "05ccae0c-99a6-4534-865d-65b3a96a5832" (UID: "05ccae0c-99a6-4534-865d-65b3a96a5832"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.375664 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-config" (OuterVolumeSpecName: "config") pod "05ccae0c-99a6-4534-865d-65b3a96a5832" (UID: "05ccae0c-99a6-4534-865d-65b3a96a5832"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.376215 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.376277 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p5vd\" (UniqueName: \"kubernetes.io/projected/05ccae0c-99a6-4534-865d-65b3a96a5832-kube-api-access-6p5vd\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.376290 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05ccae0c-99a6-4534-865d-65b3a96a5832-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:23 crc kubenswrapper[4708]: I0320 16:20:23.726662 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z5mdd"] Mar 20 16:20:23 crc kubenswrapper[4708]: W0320 16:20:23.727613 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb37996e7_c038_4444_bbe4_df1f32b2b029.slice/crio-eab8842d016fd070bdb526a82a1fb4dcb62a6d92c322c60d2c63d0c2d066c971 WatchSource:0}: Error finding container eab8842d016fd070bdb526a82a1fb4dcb62a6d92c322c60d2c63d0c2d066c971: Status 404 returned error can't find the container with id eab8842d016fd070bdb526a82a1fb4dcb62a6d92c322c60d2c63d0c2d066c971 Mar 20 16:20:23 crc 
kubenswrapper[4708]: I0320 16:20:23.852409 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 16:20:23 crc kubenswrapper[4708]: W0320 16:20:23.878495 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cde40d5_06a2_425d_a9c2_a00b10cc3148.slice/crio-f997248b03f2e88b78f047f060c60128b2432c7d934050bae5ae72cf3fe047c8 WatchSource:0}: Error finding container f997248b03f2e88b78f047f060c60128b2432c7d934050bae5ae72cf3fe047c8: Status 404 returned error can't find the container with id f997248b03f2e88b78f047f060c60128b2432c7d934050bae5ae72cf3fe047c8 Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.046372 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-42bhb" event={"ID":"459658ac-d380-46ac-9cec-377e902eba9c","Type":"ContainerStarted","Data":"8b6ece13a53de44e3868e0d9ee5240a6b7c9da58f36689cb32bde83b58ffbdd5"} Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.053117 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" event={"ID":"b37996e7-c038-4444-bbe4-df1f32b2b029","Type":"ContainerStarted","Data":"eab8842d016fd070bdb526a82a1fb4dcb62a6d92c322c60d2c63d0c2d066c971"} Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.054255 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5cde40d5-06a2-425d-a9c2-a00b10cc3148","Type":"ContainerStarted","Data":"f997248b03f2e88b78f047f060c60128b2432c7d934050bae5ae72cf3fe047c8"} Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.057098 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" event={"ID":"05ccae0c-99a6-4534-865d-65b3a96a5832","Type":"ContainerDied","Data":"a6c8cbecf0f1265aaee62d641f0c1914f9eb20486d1f02ae92d66a996411f823"} Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.057177 4708 scope.go:117] 
"RemoveContainer" containerID="afcd536290faeb13b8091d95456e7740f41957166d2c6478a70e8ed264273469" Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.057231 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.071566 4708 generic.go:334] "Generic (PLEG): container finished" podID="4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" containerID="98775f120f1971e14fadf4bfad30684035cfefb356dbe98a0dc1f6db50275080" exitCode=0 Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.073660 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" event={"ID":"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2","Type":"ContainerDied","Data":"98775f120f1971e14fadf4bfad30684035cfefb356dbe98a0dc1f6db50275080"} Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.076976 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" event={"ID":"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2","Type":"ContainerStarted","Data":"aa2abc0d8b017eb791aa086259f6413324b43b154921f8baf5741af497984df5"} Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.135930 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-42bhb" podStartSLOduration=3.135904978 podStartE2EDuration="3.135904978s" podCreationTimestamp="2026-03-20 16:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:20:24.074406745 +0000 UTC m=+1178.748743470" watchObservedRunningTime="2026-03-20 16:20:24.135904978 +0000 UTC m=+1178.810241693" Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.148595 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295d15f2-fb02-4b05-a59a-40f8e5c2a60e" path="/var/lib/kubelet/pods/295d15f2-fb02-4b05-a59a-40f8e5c2a60e/volumes" Mar 20 
16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.233811 4708 scope.go:117] "RemoveContainer" containerID="8d85bed1ed3d705769826947b71fd6aa828ccdf4dc8c64ebed0a5a148898f938" Mar 20 16:20:24 crc kubenswrapper[4708]: I0320 16:20:24.324797 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 16:20:25 crc kubenswrapper[4708]: I0320 16:20:25.109390 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" event={"ID":"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2","Type":"ContainerStarted","Data":"6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44"} Mar 20 16:20:25 crc kubenswrapper[4708]: I0320 16:20:25.109528 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:25 crc kubenswrapper[4708]: I0320 16:20:25.112585 4708 generic.go:334] "Generic (PLEG): container finished" podID="b37996e7-c038-4444-bbe4-df1f32b2b029" containerID="3a3e2de04ed18996385382077fefdb5cc50aec60910d715ea1ac23a7e18ad66e" exitCode=0 Mar 20 16:20:25 crc kubenswrapper[4708]: I0320 16:20:25.112745 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" event={"ID":"b37996e7-c038-4444-bbe4-df1f32b2b029","Type":"ContainerDied","Data":"3a3e2de04ed18996385382077fefdb5cc50aec60910d715ea1ac23a7e18ad66e"} Mar 20 16:20:25 crc kubenswrapper[4708]: I0320 16:20:25.138380 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" podStartSLOduration=4.138342862 podStartE2EDuration="4.138342862s" podCreationTimestamp="2026-03-20 16:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:20:25.132467801 +0000 UTC m=+1179.806804536" watchObservedRunningTime="2026-03-20 16:20:25.138342862 +0000 UTC m=+1179.812679577" Mar 20 16:20:26 crc 
kubenswrapper[4708]: I0320 16:20:26.124959 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5cde40d5-06a2-425d-a9c2-a00b10cc3148","Type":"ContainerStarted","Data":"8b3b8a989a41988698a80e021a26a2d1ef7152afc7c4061f61f8ca68fc5f5488"} Mar 20 16:20:26 crc kubenswrapper[4708]: I0320 16:20:26.125924 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 16:20:26 crc kubenswrapper[4708]: I0320 16:20:26.125943 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5cde40d5-06a2-425d-a9c2-a00b10cc3148","Type":"ContainerStarted","Data":"e89d386636bd5a5e6c41e086d94c558a5808755ec2cade0055343b9e10ed85aa"} Mar 20 16:20:26 crc kubenswrapper[4708]: I0320 16:20:26.127910 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" event={"ID":"b37996e7-c038-4444-bbe4-df1f32b2b029","Type":"ContainerStarted","Data":"bdff7d3822cd20ff89be1e41d6d9d9ade14fd0195e6ee91e3526461e0478479c"} Mar 20 16:20:26 crc kubenswrapper[4708]: I0320 16:20:26.128210 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:26 crc kubenswrapper[4708]: I0320 16:20:26.165590 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.823014 podStartE2EDuration="4.165570014s" podCreationTimestamp="2026-03-20 16:20:22 +0000 UTC" firstStartedPulling="2026-03-20 16:20:23.882101791 +0000 UTC m=+1178.556438506" lastFinishedPulling="2026-03-20 16:20:25.224657805 +0000 UTC m=+1179.898994520" observedRunningTime="2026-03-20 16:20:26.155155018 +0000 UTC m=+1180.829491733" watchObservedRunningTime="2026-03-20 16:20:26.165570014 +0000 UTC m=+1180.839906729" Mar 20 16:20:26 crc kubenswrapper[4708]: I0320 16:20:26.178332 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:20:26 crc kubenswrapper[4708]: I0320 16:20:26.178442 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:20:26 crc kubenswrapper[4708]: I0320 16:20:26.178939 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" podStartSLOduration=4.178918249 podStartE2EDuration="4.178918249s" podCreationTimestamp="2026-03-20 16:20:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:20:26.174154089 +0000 UTC m=+1180.848490804" watchObservedRunningTime="2026-03-20 16:20:26.178918249 +0000 UTC m=+1180.853254964" Mar 20 16:20:27 crc kubenswrapper[4708]: I0320 16:20:27.142752 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-22qr2" event={"ID":"8ad8d5cb-c681-406d-8dee-25f0a0f71b83","Type":"ContainerStarted","Data":"567b8628ea7f75e4d1a44ad0ae6ab0d2271fb21fb3230a752770cbe3587221c7"} Mar 20 16:20:27 crc kubenswrapper[4708]: I0320 16:20:27.144376 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-22qr2" Mar 20 16:20:27 crc kubenswrapper[4708]: I0320 16:20:27.171698 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-22qr2" podStartSLOduration=16.770041466 podStartE2EDuration="50.171643286s" podCreationTimestamp="2026-03-20 16:19:37 +0000 UTC" firstStartedPulling="2026-03-20 
16:19:53.180822275 +0000 UTC m=+1147.855158990" lastFinishedPulling="2026-03-20 16:20:26.582424085 +0000 UTC m=+1181.256760810" observedRunningTime="2026-03-20 16:20:27.17104771 +0000 UTC m=+1181.845384425" watchObservedRunningTime="2026-03-20 16:20:27.171643286 +0000 UTC m=+1181.845980011" Mar 20 16:20:29 crc kubenswrapper[4708]: I0320 16:20:29.161580 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d80d56a8-3037-4d99-afd9-61aeecc4259c","Type":"ContainerStarted","Data":"2a1b5361d1f326c61c8cc8301f9f8c4f48b0e9644484bf3af93359ec3799ba86"} Mar 20 16:20:29 crc kubenswrapper[4708]: I0320 16:20:29.164975 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 16:20:29 crc kubenswrapper[4708]: I0320 16:20:29.187496 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.361879024 podStartE2EDuration="58.187471942s" podCreationTimestamp="2026-03-20 16:19:31 +0000 UTC" firstStartedPulling="2026-03-20 16:19:51.909006817 +0000 UTC m=+1146.583343532" lastFinishedPulling="2026-03-20 16:20:28.734599735 +0000 UTC m=+1183.408936450" observedRunningTime="2026-03-20 16:20:29.180908542 +0000 UTC m=+1183.855245257" watchObservedRunningTime="2026-03-20 16:20:29.187471942 +0000 UTC m=+1183.861808657" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.380558 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-k8vjm"] Mar 20 16:20:30 crc kubenswrapper[4708]: E0320 16:20:30.381233 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ccae0c-99a6-4534-865d-65b3a96a5832" containerName="init" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.381250 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ccae0c-99a6-4534-865d-65b3a96a5832" containerName="init" Mar 20 16:20:30 crc kubenswrapper[4708]: E0320 16:20:30.381319 4708 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="05ccae0c-99a6-4534-865d-65b3a96a5832" containerName="dnsmasq-dns" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.381327 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ccae0c-99a6-4534-865d-65b3a96a5832" containerName="dnsmasq-dns" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.381488 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="05ccae0c-99a6-4534-865d-65b3a96a5832" containerName="dnsmasq-dns" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.382058 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k8vjm" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.385224 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.395886 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k8vjm"] Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.444322 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.444389 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.512453 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.535526 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-operator-scripts\") pod \"root-account-create-update-k8vjm\" (UID: \"a2c973f5-81e6-41d1-ad8c-7ab713e44a75\") " pod="openstack/root-account-create-update-k8vjm" Mar 20 16:20:30 crc 
kubenswrapper[4708]: I0320 16:20:30.535606 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wpnl\" (UniqueName: \"kubernetes.io/projected/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-kube-api-access-8wpnl\") pod \"root-account-create-update-k8vjm\" (UID: \"a2c973f5-81e6-41d1-ad8c-7ab713e44a75\") " pod="openstack/root-account-create-update-k8vjm" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.637635 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-operator-scripts\") pod \"root-account-create-update-k8vjm\" (UID: \"a2c973f5-81e6-41d1-ad8c-7ab713e44a75\") " pod="openstack/root-account-create-update-k8vjm" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.637742 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wpnl\" (UniqueName: \"kubernetes.io/projected/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-kube-api-access-8wpnl\") pod \"root-account-create-update-k8vjm\" (UID: \"a2c973f5-81e6-41d1-ad8c-7ab713e44a75\") " pod="openstack/root-account-create-update-k8vjm" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.638636 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-operator-scripts\") pod \"root-account-create-update-k8vjm\" (UID: \"a2c973f5-81e6-41d1-ad8c-7ab713e44a75\") " pod="openstack/root-account-create-update-k8vjm" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.665197 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wpnl\" (UniqueName: \"kubernetes.io/projected/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-kube-api-access-8wpnl\") pod \"root-account-create-update-k8vjm\" (UID: \"a2c973f5-81e6-41d1-ad8c-7ab713e44a75\") " 
pod="openstack/root-account-create-update-k8vjm" Mar 20 16:20:30 crc kubenswrapper[4708]: I0320 16:20:30.701243 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k8vjm" Mar 20 16:20:31 crc kubenswrapper[4708]: I0320 16:20:31.151273 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k8vjm"] Mar 20 16:20:31 crc kubenswrapper[4708]: I0320 16:20:31.182044 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k8vjm" event={"ID":"a2c973f5-81e6-41d1-ad8c-7ab713e44a75","Type":"ContainerStarted","Data":"0bc9a7ee8d13c00ec72677debeab01dcac5c48f1dfae5fd97999d1ecdfe3ff80"} Mar 20 16:20:31 crc kubenswrapper[4708]: I0320 16:20:31.274917 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 16:20:32 crc kubenswrapper[4708]: I0320 16:20:32.146855 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:32 crc kubenswrapper[4708]: I0320 16:20:32.192597 4708 generic.go:334] "Generic (PLEG): container finished" podID="a2c973f5-81e6-41d1-ad8c-7ab713e44a75" containerID="ea3a32bd99e17c4a243bceeff47881485473daf06ed79d00bb8cf2e91c78cffa" exitCode=0 Mar 20 16:20:32 crc kubenswrapper[4708]: I0320 16:20:32.192689 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k8vjm" event={"ID":"a2c973f5-81e6-41d1-ad8c-7ab713e44a75","Type":"ContainerDied","Data":"ea3a32bd99e17c4a243bceeff47881485473daf06ed79d00bb8cf2e91c78cffa"} Mar 20 16:20:32 crc kubenswrapper[4708]: I0320 16:20:32.975051 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-p756b"] Mar 20 16:20:32 crc kubenswrapper[4708]: I0320 16:20:32.976460 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-p756b" Mar 20 16:20:32 crc kubenswrapper[4708]: I0320 16:20:32.990352 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-p756b"] Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.051229 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-930e-account-create-update-nzvsx"] Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.052985 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-930e-account-create-update-nzvsx" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.055678 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.063765 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-930e-account-create-update-nzvsx"] Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.079576 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50a31e27-283d-43b0-91b5-71c548d61e27-operator-scripts\") pod \"keystone-db-create-p756b\" (UID: \"50a31e27-283d-43b0-91b5-71c548d61e27\") " pod="openstack/keystone-db-create-p756b" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.079733 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w59h\" (UniqueName: \"kubernetes.io/projected/50a31e27-283d-43b0-91b5-71c548d61e27-kube-api-access-2w59h\") pod \"keystone-db-create-p756b\" (UID: \"50a31e27-283d-43b0-91b5-71c548d61e27\") " pod="openstack/keystone-db-create-p756b" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.179395 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-h7q7q"] Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.180572 4708 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h7q7q" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.181792 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50a31e27-283d-43b0-91b5-71c548d61e27-operator-scripts\") pod \"keystone-db-create-p756b\" (UID: \"50a31e27-283d-43b0-91b5-71c548d61e27\") " pod="openstack/keystone-db-create-p756b" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.181863 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-operator-scripts\") pod \"keystone-930e-account-create-update-nzvsx\" (UID: \"799c71bd-9b76-4d2d-a4b9-6953e7ee2863\") " pod="openstack/keystone-930e-account-create-update-nzvsx" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.183104 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w59h\" (UniqueName: \"kubernetes.io/projected/50a31e27-283d-43b0-91b5-71c548d61e27-kube-api-access-2w59h\") pod \"keystone-db-create-p756b\" (UID: \"50a31e27-283d-43b0-91b5-71c548d61e27\") " pod="openstack/keystone-db-create-p756b" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.183148 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjjdd\" (UniqueName: \"kubernetes.io/projected/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-kube-api-access-wjjdd\") pod \"keystone-930e-account-create-update-nzvsx\" (UID: \"799c71bd-9b76-4d2d-a4b9-6953e7ee2863\") " pod="openstack/keystone-930e-account-create-update-nzvsx" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.183912 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/50a31e27-283d-43b0-91b5-71c548d61e27-operator-scripts\") pod \"keystone-db-create-p756b\" (UID: \"50a31e27-283d-43b0-91b5-71c548d61e27\") " pod="openstack/keystone-db-create-p756b" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.191441 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-h7q7q"] Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.198837 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.203344 4708 generic.go:334] "Generic (PLEG): container finished" podID="1da957bf-1f80-4bef-9033-333fa60118c3" containerID="c301f7b2aab06d90d1bd62af27049a7e6e3f39f196ab300d7951a8d4dc7d3364" exitCode=0 Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.203630 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1da957bf-1f80-4bef-9033-333fa60118c3","Type":"ContainerDied","Data":"c301f7b2aab06d90d1bd62af27049a7e6e3f39f196ab300d7951a8d4dc7d3364"} Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.209329 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w59h\" (UniqueName: \"kubernetes.io/projected/50a31e27-283d-43b0-91b5-71c548d61e27-kube-api-access-2w59h\") pod \"keystone-db-create-p756b\" (UID: \"50a31e27-283d-43b0-91b5-71c548d61e27\") " pod="openstack/keystone-db-create-p756b" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.212047 4708 generic.go:334] "Generic (PLEG): container finished" podID="103fe6f4-2ac5-430b-9ce4-2d142b273674" containerID="5d4a7dd1f5603e92bc8e2924bc34be2c14bea6aeb541bbec562edee75673265d" exitCode=0 Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.212318 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"103fe6f4-2ac5-430b-9ce4-2d142b273674","Type":"ContainerDied","Data":"5d4a7dd1f5603e92bc8e2924bc34be2c14bea6aeb541bbec562edee75673265d"} Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.263731 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2jct"] Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.264874 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" podUID="4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" containerName="dnsmasq-dns" containerID="cri-o://6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44" gracePeriod=10 Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.286337 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-operator-scripts\") pod \"placement-db-create-h7q7q\" (UID: \"c03712fd-33bf-454a-a7f7-f907e9b9c0ec\") " pod="openstack/placement-db-create-h7q7q" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.286452 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjjdd\" (UniqueName: \"kubernetes.io/projected/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-kube-api-access-wjjdd\") pod \"keystone-930e-account-create-update-nzvsx\" (UID: \"799c71bd-9b76-4d2d-a4b9-6953e7ee2863\") " pod="openstack/keystone-930e-account-create-update-nzvsx" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.286541 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-operator-scripts\") pod \"keystone-930e-account-create-update-nzvsx\" (UID: \"799c71bd-9b76-4d2d-a4b9-6953e7ee2863\") " pod="openstack/keystone-930e-account-create-update-nzvsx" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.286578 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nvsf\" (UniqueName: \"kubernetes.io/projected/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-kube-api-access-4nvsf\") pod \"placement-db-create-h7q7q\" (UID: \"c03712fd-33bf-454a-a7f7-f907e9b9c0ec\") " pod="openstack/placement-db-create-h7q7q" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.304890 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-p756b" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.317459 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-operator-scripts\") pod \"keystone-930e-account-create-update-nzvsx\" (UID: \"799c71bd-9b76-4d2d-a4b9-6953e7ee2863\") " pod="openstack/keystone-930e-account-create-update-nzvsx" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.349516 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjjdd\" (UniqueName: \"kubernetes.io/projected/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-kube-api-access-wjjdd\") pod \"keystone-930e-account-create-update-nzvsx\" (UID: \"799c71bd-9b76-4d2d-a4b9-6953e7ee2863\") " pod="openstack/keystone-930e-account-create-update-nzvsx" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.405732 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-operator-scripts\") pod \"placement-db-create-h7q7q\" (UID: \"c03712fd-33bf-454a-a7f7-f907e9b9c0ec\") " pod="openstack/placement-db-create-h7q7q" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.405981 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nvsf\" (UniqueName: 
\"kubernetes.io/projected/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-kube-api-access-4nvsf\") pod \"placement-db-create-h7q7q\" (UID: \"c03712fd-33bf-454a-a7f7-f907e9b9c0ec\") " pod="openstack/placement-db-create-h7q7q" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.407891 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-operator-scripts\") pod \"placement-db-create-h7q7q\" (UID: \"c03712fd-33bf-454a-a7f7-f907e9b9c0ec\") " pod="openstack/placement-db-create-h7q7q" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.420565 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8c1b-account-create-update-r92gp"] Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.421848 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8c1b-account-create-update-r92gp" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.423875 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.424266 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8c1b-account-create-update-r92gp"] Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.451255 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nvsf\" (UniqueName: \"kubernetes.io/projected/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-kube-api-access-4nvsf\") pod \"placement-db-create-h7q7q\" (UID: \"c03712fd-33bf-454a-a7f7-f907e9b9c0ec\") " pod="openstack/placement-db-create-h7q7q" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.507616 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmpd2\" (UniqueName: \"kubernetes.io/projected/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-kube-api-access-pmpd2\") pod 
\"placement-8c1b-account-create-update-r92gp\" (UID: \"e5e322f6-c3f7-4870-84ae-f27c1d4ba293\") " pod="openstack/placement-8c1b-account-create-update-r92gp" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.508174 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-operator-scripts\") pod \"placement-8c1b-account-create-update-r92gp\" (UID: \"e5e322f6-c3f7-4870-84ae-f27c1d4ba293\") " pod="openstack/placement-8c1b-account-create-update-r92gp" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.543787 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h7q7q" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.610250 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpd2\" (UniqueName: \"kubernetes.io/projected/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-kube-api-access-pmpd2\") pod \"placement-8c1b-account-create-update-r92gp\" (UID: \"e5e322f6-c3f7-4870-84ae-f27c1d4ba293\") " pod="openstack/placement-8c1b-account-create-update-r92gp" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.610395 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-operator-scripts\") pod \"placement-8c1b-account-create-update-r92gp\" (UID: \"e5e322f6-c3f7-4870-84ae-f27c1d4ba293\") " pod="openstack/placement-8c1b-account-create-update-r92gp" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.611140 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-operator-scripts\") pod \"placement-8c1b-account-create-update-r92gp\" (UID: \"e5e322f6-c3f7-4870-84ae-f27c1d4ba293\") " 
pod="openstack/placement-8c1b-account-create-update-r92gp" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.637815 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmpd2\" (UniqueName: \"kubernetes.io/projected/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-kube-api-access-pmpd2\") pod \"placement-8c1b-account-create-update-r92gp\" (UID: \"e5e322f6-c3f7-4870-84ae-f27c1d4ba293\") " pod="openstack/placement-8c1b-account-create-update-r92gp" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.669092 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-930e-account-create-update-nzvsx" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.776202 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k8vjm" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.875930 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8c1b-account-create-update-r92gp" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.918637 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-operator-scripts\") pod \"a2c973f5-81e6-41d1-ad8c-7ab713e44a75\" (UID: \"a2c973f5-81e6-41d1-ad8c-7ab713e44a75\") " Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.919394 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2c973f5-81e6-41d1-ad8c-7ab713e44a75" (UID: "a2c973f5-81e6-41d1-ad8c-7ab713e44a75"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.919565 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wpnl\" (UniqueName: \"kubernetes.io/projected/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-kube-api-access-8wpnl\") pod \"a2c973f5-81e6-41d1-ad8c-7ab713e44a75\" (UID: \"a2c973f5-81e6-41d1-ad8c-7ab713e44a75\") " Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.919964 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.925040 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-kube-api-access-8wpnl" (OuterVolumeSpecName: "kube-api-access-8wpnl") pod "a2c973f5-81e6-41d1-ad8c-7ab713e44a75" (UID: "a2c973f5-81e6-41d1-ad8c-7ab713e44a75"). InnerVolumeSpecName "kube-api-access-8wpnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.933422 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-p756b"] Mar 20 16:20:33 crc kubenswrapper[4708]: W0320 16:20:33.934471 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50a31e27_283d_43b0_91b5_71c548d61e27.slice/crio-fee703f4c5bba3d9df964c52f8836f36d16b8c2caeeab07c785e28a856f6f582 WatchSource:0}: Error finding container fee703f4c5bba3d9df964c52f8836f36d16b8c2caeeab07c785e28a856f6f582: Status 404 returned error can't find the container with id fee703f4c5bba3d9df964c52f8836f36d16b8c2caeeab07c785e28a856f6f582 Mar 20 16:20:33 crc kubenswrapper[4708]: I0320 16:20:33.957395 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.023708 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wpnl\" (UniqueName: \"kubernetes.io/projected/a2c973f5-81e6-41d1-ad8c-7ab713e44a75-kube-api-access-8wpnl\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.126779 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5g65\" (UniqueName: \"kubernetes.io/projected/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-kube-api-access-f5g65\") pod \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.126826 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-ovsdbserver-nb\") pod \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.126879 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-config\") pod \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.126925 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-dns-svc\") pod \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\" (UID: \"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2\") " Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.150156 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-kube-api-access-f5g65" (OuterVolumeSpecName: 
"kube-api-access-f5g65") pod "4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" (UID: "4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2"). InnerVolumeSpecName "kube-api-access-f5g65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.172654 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" (UID: "4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.180719 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" (UID: "4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.203655 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-config" (OuterVolumeSpecName: "config") pod "4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" (UID: "4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.224033 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-p756b" event={"ID":"50a31e27-283d-43b0-91b5-71c548d61e27","Type":"ContainerStarted","Data":"8fbe54b40e5f5c97037f62b168a6ff3392879081a253eee9d30bac66ba93c773"} Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.224080 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-p756b" event={"ID":"50a31e27-283d-43b0-91b5-71c548d61e27","Type":"ContainerStarted","Data":"fee703f4c5bba3d9df964c52f8836f36d16b8c2caeeab07c785e28a856f6f582"} Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.225576 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k8vjm" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.225854 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k8vjm" event={"ID":"a2c973f5-81e6-41d1-ad8c-7ab713e44a75","Type":"ContainerDied","Data":"0bc9a7ee8d13c00ec72677debeab01dcac5c48f1dfae5fd97999d1ecdfe3ff80"} Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.225895 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc9a7ee8d13c00ec72677debeab01dcac5c48f1dfae5fd97999d1ecdfe3ff80" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.228985 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-h7q7q"] Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.230114 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5g65\" (UniqueName: \"kubernetes.io/projected/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-kube-api-access-f5g65\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.230138 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.230147 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.230156 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.231381 4708 generic.go:334] "Generic (PLEG): container finished" podID="4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" containerID="6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44" exitCode=0 Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.231430 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" event={"ID":"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2","Type":"ContainerDied","Data":"6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44"} Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.231453 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" event={"ID":"4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2","Type":"ContainerDied","Data":"aa2abc0d8b017eb791aa086259f6413324b43b154921f8baf5741af497984df5"} Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.231472 4708 scope.go:117] "RemoveContainer" containerID="6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.231500 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-b2jct" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.253189 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1da957bf-1f80-4bef-9033-333fa60118c3","Type":"ContainerStarted","Data":"6a41d4ef5c584c47a62b6472ca31c29a4b7a0d3bc6460ac01cbab3b2f136ebb5"} Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.254279 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.260485 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"103fe6f4-2ac5-430b-9ce4-2d142b273674","Type":"ContainerStarted","Data":"4ccecbbada6a99e337b3285d0b17846876e43a2933aef8e13eea2835ddf7b269"} Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.261355 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.287929 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.06745244 podStartE2EDuration="1m7.287905063s" podCreationTimestamp="2026-03-20 16:19:27 +0000 UTC" firstStartedPulling="2026-03-20 16:19:29.281107259 +0000 UTC m=+1123.955443974" lastFinishedPulling="2026-03-20 16:19:59.501559882 +0000 UTC m=+1154.175896597" observedRunningTime="2026-03-20 16:20:34.28012762 +0000 UTC m=+1188.954464335" watchObservedRunningTime="2026-03-20 16:20:34.287905063 +0000 UTC m=+1188.962241788" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.312384 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.710883466 podStartE2EDuration="1m7.312366613s" podCreationTimestamp="2026-03-20 16:19:27 +0000 UTC" firstStartedPulling="2026-03-20 16:19:34.979572472 +0000 UTC 
m=+1129.653909187" lastFinishedPulling="2026-03-20 16:19:59.581055619 +0000 UTC m=+1154.255392334" observedRunningTime="2026-03-20 16:20:34.309850524 +0000 UTC m=+1188.984187239" watchObservedRunningTime="2026-03-20 16:20:34.312366613 +0000 UTC m=+1188.986703328" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.392258 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-930e-account-create-update-nzvsx"] Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.399683 4708 scope.go:117] "RemoveContainer" containerID="98775f120f1971e14fadf4bfad30684035cfefb356dbe98a0dc1f6db50275080" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.459037 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8c1b-account-create-update-r92gp"] Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.481191 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2jct"] Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.489245 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2jct"] Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.502377 4708 scope.go:117] "RemoveContainer" containerID="6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44" Mar 20 16:20:34 crc kubenswrapper[4708]: E0320 16:20:34.504433 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44\": container with ID starting with 6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44 not found: ID does not exist" containerID="6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.504486 4708 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44"} err="failed to get container status \"6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44\": rpc error: code = NotFound desc = could not find container \"6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44\": container with ID starting with 6e9005459b1292019969bb9b0f01bc12c2683198d6e028f2b14d4d29ad45db44 not found: ID does not exist" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.504516 4708 scope.go:117] "RemoveContainer" containerID="98775f120f1971e14fadf4bfad30684035cfefb356dbe98a0dc1f6db50275080" Mar 20 16:20:34 crc kubenswrapper[4708]: E0320 16:20:34.508212 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98775f120f1971e14fadf4bfad30684035cfefb356dbe98a0dc1f6db50275080\": container with ID starting with 98775f120f1971e14fadf4bfad30684035cfefb356dbe98a0dc1f6db50275080 not found: ID does not exist" containerID="98775f120f1971e14fadf4bfad30684035cfefb356dbe98a0dc1f6db50275080" Mar 20 16:20:34 crc kubenswrapper[4708]: I0320 16:20:34.508266 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98775f120f1971e14fadf4bfad30684035cfefb356dbe98a0dc1f6db50275080"} err="failed to get container status \"98775f120f1971e14fadf4bfad30684035cfefb356dbe98a0dc1f6db50275080\": rpc error: code = NotFound desc = could not find container \"98775f120f1971e14fadf4bfad30684035cfefb356dbe98a0dc1f6db50275080\": container with ID starting with 98775f120f1971e14fadf4bfad30684035cfefb356dbe98a0dc1f6db50275080 not found: ID does not exist" Mar 20 16:20:35 crc kubenswrapper[4708]: I0320 16:20:35.271707 4708 generic.go:334] "Generic (PLEG): container finished" podID="e5e322f6-c3f7-4870-84ae-f27c1d4ba293" containerID="795384e065eb592d7a2cb0388d3a77603fdafcbc22b617d0066faf8b965f6be4" exitCode=0 Mar 20 16:20:35 crc kubenswrapper[4708]: 
I0320 16:20:35.271770 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8c1b-account-create-update-r92gp" event={"ID":"e5e322f6-c3f7-4870-84ae-f27c1d4ba293","Type":"ContainerDied","Data":"795384e065eb592d7a2cb0388d3a77603fdafcbc22b617d0066faf8b965f6be4"} Mar 20 16:20:35 crc kubenswrapper[4708]: I0320 16:20:35.272144 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8c1b-account-create-update-r92gp" event={"ID":"e5e322f6-c3f7-4870-84ae-f27c1d4ba293","Type":"ContainerStarted","Data":"eb2ee71fc2959d0853d116f4882682af7e9f5578b5585e9eacafd3e9b19f98b0"} Mar 20 16:20:35 crc kubenswrapper[4708]: I0320 16:20:35.276819 4708 generic.go:334] "Generic (PLEG): container finished" podID="50a31e27-283d-43b0-91b5-71c548d61e27" containerID="8fbe54b40e5f5c97037f62b168a6ff3392879081a253eee9d30bac66ba93c773" exitCode=0 Mar 20 16:20:35 crc kubenswrapper[4708]: I0320 16:20:35.276906 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-p756b" event={"ID":"50a31e27-283d-43b0-91b5-71c548d61e27","Type":"ContainerDied","Data":"8fbe54b40e5f5c97037f62b168a6ff3392879081a253eee9d30bac66ba93c773"} Mar 20 16:20:35 crc kubenswrapper[4708]: I0320 16:20:35.281282 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-930e-account-create-update-nzvsx" event={"ID":"799c71bd-9b76-4d2d-a4b9-6953e7ee2863","Type":"ContainerDied","Data":"213575d6a25a26b34c0003211c4961a6918a6086451952f45af358a92d64fc51"} Mar 20 16:20:35 crc kubenswrapper[4708]: I0320 16:20:35.281571 4708 generic.go:334] "Generic (PLEG): container finished" podID="799c71bd-9b76-4d2d-a4b9-6953e7ee2863" containerID="213575d6a25a26b34c0003211c4961a6918a6086451952f45af358a92d64fc51" exitCode=0 Mar 20 16:20:35 crc kubenswrapper[4708]: I0320 16:20:35.281875 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-930e-account-create-update-nzvsx" 
event={"ID":"799c71bd-9b76-4d2d-a4b9-6953e7ee2863","Type":"ContainerStarted","Data":"6fb099703bb5bacdf7f092cc0ca73be771c7b09e499ba6482ca2e0ce7c9ff901"} Mar 20 16:20:35 crc kubenswrapper[4708]: I0320 16:20:35.284370 4708 generic.go:334] "Generic (PLEG): container finished" podID="c03712fd-33bf-454a-a7f7-f907e9b9c0ec" containerID="0643e6442963881271a1a07da2269fb37ccd6145883356b6740dd9f138bd7230" exitCode=0 Mar 20 16:20:35 crc kubenswrapper[4708]: I0320 16:20:35.284453 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h7q7q" event={"ID":"c03712fd-33bf-454a-a7f7-f907e9b9c0ec","Type":"ContainerDied","Data":"0643e6442963881271a1a07da2269fb37ccd6145883356b6740dd9f138bd7230"} Mar 20 16:20:35 crc kubenswrapper[4708]: I0320 16:20:35.284705 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h7q7q" event={"ID":"c03712fd-33bf-454a-a7f7-f907e9b9c0ec","Type":"ContainerStarted","Data":"87bc82cdd6d20f0547e9c39f402ef94116dd83b8a52430a5a7e483757c8d8e51"} Mar 20 16:20:36 crc kubenswrapper[4708]: I0320 16:20:36.128486 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" path="/var/lib/kubelet/pods/4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2/volumes" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:36.772031 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-930e-account-create-update-nzvsx" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:36.900470 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjjdd\" (UniqueName: \"kubernetes.io/projected/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-kube-api-access-wjjdd\") pod \"799c71bd-9b76-4d2d-a4b9-6953e7ee2863\" (UID: \"799c71bd-9b76-4d2d-a4b9-6953e7ee2863\") " Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:36.900678 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-operator-scripts\") pod \"799c71bd-9b76-4d2d-a4b9-6953e7ee2863\" (UID: \"799c71bd-9b76-4d2d-a4b9-6953e7ee2863\") " Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:36.904323 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "799c71bd-9b76-4d2d-a4b9-6953e7ee2863" (UID: "799c71bd-9b76-4d2d-a4b9-6953e7ee2863"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:36.919937 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-kube-api-access-wjjdd" (OuterVolumeSpecName: "kube-api-access-wjjdd") pod "799c71bd-9b76-4d2d-a4b9-6953e7ee2863" (UID: "799c71bd-9b76-4d2d-a4b9-6953e7ee2863"). InnerVolumeSpecName "kube-api-access-wjjdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:36.986974 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.002573 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.002603 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjjdd\" (UniqueName: \"kubernetes.io/projected/799c71bd-9b76-4d2d-a4b9-6953e7ee2863-kube-api-access-wjjdd\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.191873 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-dw2jc"] Mar 20 16:20:37 crc kubenswrapper[4708]: E0320 16:20:37.192366 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c973f5-81e6-41d1-ad8c-7ab713e44a75" containerName="mariadb-account-create-update" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.192386 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c973f5-81e6-41d1-ad8c-7ab713e44a75" containerName="mariadb-account-create-update" Mar 20 16:20:37 crc kubenswrapper[4708]: E0320 16:20:37.192415 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799c71bd-9b76-4d2d-a4b9-6953e7ee2863" containerName="mariadb-account-create-update" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.192422 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="799c71bd-9b76-4d2d-a4b9-6953e7ee2863" containerName="mariadb-account-create-update" Mar 20 16:20:37 crc kubenswrapper[4708]: E0320 16:20:37.192442 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" containerName="init" Mar 20 16:20:37 crc 
kubenswrapper[4708]: I0320 16:20:37.192450 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" containerName="init" Mar 20 16:20:37 crc kubenswrapper[4708]: E0320 16:20:37.192474 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" containerName="dnsmasq-dns" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.192481 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" containerName="dnsmasq-dns" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.192659 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2c973f5-81e6-41d1-ad8c-7ab713e44a75" containerName="mariadb-account-create-update" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.192685 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb7af17-adad-4cc2-a3f3-95a4f7b9d5c2" containerName="dnsmasq-dns" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.192721 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="799c71bd-9b76-4d2d-a4b9-6953e7ee2863" containerName="mariadb-account-create-update" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.193380 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dw2jc" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.199463 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dw2jc"] Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.301221 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3dc6-account-create-update-cqsnp"] Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.306662 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3dc6-account-create-update-cqsnp" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.306744 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-930e-account-create-update-nzvsx" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.306725 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-930e-account-create-update-nzvsx" event={"ID":"799c71bd-9b76-4d2d-a4b9-6953e7ee2863","Type":"ContainerDied","Data":"6fb099703bb5bacdf7f092cc0ca73be771c7b09e499ba6482ca2e0ce7c9ff901"} Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.306780 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fb099703bb5bacdf7f092cc0ca73be771c7b09e499ba6482ca2e0ce7c9ff901" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.311606 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.312559 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snp49\" (UniqueName: \"kubernetes.io/projected/720c033b-2069-47be-b543-00c6005b496b-kube-api-access-snp49\") pod \"glance-db-create-dw2jc\" (UID: \"720c033b-2069-47be-b543-00c6005b496b\") " pod="openstack/glance-db-create-dw2jc" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.312883 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/720c033b-2069-47be-b543-00c6005b496b-operator-scripts\") pod \"glance-db-create-dw2jc\" (UID: \"720c033b-2069-47be-b543-00c6005b496b\") " pod="openstack/glance-db-create-dw2jc" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.316548 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3dc6-account-create-update-cqsnp"] Mar 20 16:20:37 crc 
kubenswrapper[4708]: I0320 16:20:37.414998 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/720c033b-2069-47be-b543-00c6005b496b-operator-scripts\") pod \"glance-db-create-dw2jc\" (UID: \"720c033b-2069-47be-b543-00c6005b496b\") " pod="openstack/glance-db-create-dw2jc" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.415491 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snp49\" (UniqueName: \"kubernetes.io/projected/720c033b-2069-47be-b543-00c6005b496b-kube-api-access-snp49\") pod \"glance-db-create-dw2jc\" (UID: \"720c033b-2069-47be-b543-00c6005b496b\") " pod="openstack/glance-db-create-dw2jc" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.415527 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5609997e-9b3b-4472-a91a-0948eacb77f1-operator-scripts\") pod \"glance-3dc6-account-create-update-cqsnp\" (UID: \"5609997e-9b3b-4472-a91a-0948eacb77f1\") " pod="openstack/glance-3dc6-account-create-update-cqsnp" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.415562 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mbng\" (UniqueName: \"kubernetes.io/projected/5609997e-9b3b-4472-a91a-0948eacb77f1-kube-api-access-5mbng\") pod \"glance-3dc6-account-create-update-cqsnp\" (UID: \"5609997e-9b3b-4472-a91a-0948eacb77f1\") " pod="openstack/glance-3dc6-account-create-update-cqsnp" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.416457 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/720c033b-2069-47be-b543-00c6005b496b-operator-scripts\") pod \"glance-db-create-dw2jc\" (UID: \"720c033b-2069-47be-b543-00c6005b496b\") " 
pod="openstack/glance-db-create-dw2jc" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.453684 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snp49\" (UniqueName: \"kubernetes.io/projected/720c033b-2069-47be-b543-00c6005b496b-kube-api-access-snp49\") pod \"glance-db-create-dw2jc\" (UID: \"720c033b-2069-47be-b543-00c6005b496b\") " pod="openstack/glance-db-create-dw2jc" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.492891 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-p756b" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.496869 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8c1b-account-create-update-r92gp" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.508747 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h7q7q" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.517011 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5609997e-9b3b-4472-a91a-0948eacb77f1-operator-scripts\") pod \"glance-3dc6-account-create-update-cqsnp\" (UID: \"5609997e-9b3b-4472-a91a-0948eacb77f1\") " pod="openstack/glance-3dc6-account-create-update-cqsnp" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.517058 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mbng\" (UniqueName: \"kubernetes.io/projected/5609997e-9b3b-4472-a91a-0948eacb77f1-kube-api-access-5mbng\") pod \"glance-3dc6-account-create-update-cqsnp\" (UID: \"5609997e-9b3b-4472-a91a-0948eacb77f1\") " pod="openstack/glance-3dc6-account-create-update-cqsnp" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.518043 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/5609997e-9b3b-4472-a91a-0948eacb77f1-operator-scripts\") pod \"glance-3dc6-account-create-update-cqsnp\" (UID: \"5609997e-9b3b-4472-a91a-0948eacb77f1\") " pod="openstack/glance-3dc6-account-create-update-cqsnp" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.523171 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dw2jc" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.538690 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mbng\" (UniqueName: \"kubernetes.io/projected/5609997e-9b3b-4472-a91a-0948eacb77f1-kube-api-access-5mbng\") pod \"glance-3dc6-account-create-update-cqsnp\" (UID: \"5609997e-9b3b-4472-a91a-0948eacb77f1\") " pod="openstack/glance-3dc6-account-create-update-cqsnp" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.617785 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmpd2\" (UniqueName: \"kubernetes.io/projected/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-kube-api-access-pmpd2\") pod \"e5e322f6-c3f7-4870-84ae-f27c1d4ba293\" (UID: \"e5e322f6-c3f7-4870-84ae-f27c1d4ba293\") " Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.617998 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-operator-scripts\") pod \"e5e322f6-c3f7-4870-84ae-f27c1d4ba293\" (UID: \"e5e322f6-c3f7-4870-84ae-f27c1d4ba293\") " Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.618041 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w59h\" (UniqueName: \"kubernetes.io/projected/50a31e27-283d-43b0-91b5-71c548d61e27-kube-api-access-2w59h\") pod \"50a31e27-283d-43b0-91b5-71c548d61e27\" (UID: \"50a31e27-283d-43b0-91b5-71c548d61e27\") " Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 
16:20:37.618073 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-operator-scripts\") pod \"c03712fd-33bf-454a-a7f7-f907e9b9c0ec\" (UID: \"c03712fd-33bf-454a-a7f7-f907e9b9c0ec\") " Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.618102 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nvsf\" (UniqueName: \"kubernetes.io/projected/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-kube-api-access-4nvsf\") pod \"c03712fd-33bf-454a-a7f7-f907e9b9c0ec\" (UID: \"c03712fd-33bf-454a-a7f7-f907e9b9c0ec\") " Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.618176 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50a31e27-283d-43b0-91b5-71c548d61e27-operator-scripts\") pod \"50a31e27-283d-43b0-91b5-71c548d61e27\" (UID: \"50a31e27-283d-43b0-91b5-71c548d61e27\") " Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.619442 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50a31e27-283d-43b0-91b5-71c548d61e27-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50a31e27-283d-43b0-91b5-71c548d61e27" (UID: "50a31e27-283d-43b0-91b5-71c548d61e27"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.625317 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5e322f6-c3f7-4870-84ae-f27c1d4ba293" (UID: "e5e322f6-c3f7-4870-84ae-f27c1d4ba293"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.625577 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c03712fd-33bf-454a-a7f7-f907e9b9c0ec" (UID: "c03712fd-33bf-454a-a7f7-f907e9b9c0ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.626176 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3dc6-account-create-update-cqsnp" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.630296 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a31e27-283d-43b0-91b5-71c548d61e27-kube-api-access-2w59h" (OuterVolumeSpecName: "kube-api-access-2w59h") pod "50a31e27-283d-43b0-91b5-71c548d61e27" (UID: "50a31e27-283d-43b0-91b5-71c548d61e27"). InnerVolumeSpecName "kube-api-access-2w59h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.632752 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-kube-api-access-4nvsf" (OuterVolumeSpecName: "kube-api-access-4nvsf") pod "c03712fd-33bf-454a-a7f7-f907e9b9c0ec" (UID: "c03712fd-33bf-454a-a7f7-f907e9b9c0ec"). InnerVolumeSpecName "kube-api-access-4nvsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.636475 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-kube-api-access-pmpd2" (OuterVolumeSpecName: "kube-api-access-pmpd2") pod "e5e322f6-c3f7-4870-84ae-f27c1d4ba293" (UID: "e5e322f6-c3f7-4870-84ae-f27c1d4ba293"). 
InnerVolumeSpecName "kube-api-access-pmpd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.720859 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.720903 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w59h\" (UniqueName: \"kubernetes.io/projected/50a31e27-283d-43b0-91b5-71c548d61e27-kube-api-access-2w59h\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.720916 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.720930 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nvsf\" (UniqueName: \"kubernetes.io/projected/c03712fd-33bf-454a-a7f7-f907e9b9c0ec-kube-api-access-4nvsf\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.720941 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50a31e27-283d-43b0-91b5-71c548d61e27-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.720952 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmpd2\" (UniqueName: \"kubernetes.io/projected/e5e322f6-c3f7-4870-84ae-f27c1d4ba293-kube-api-access-pmpd2\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:37 crc kubenswrapper[4708]: I0320 16:20:37.933787 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dw2jc"] Mar 20 16:20:38 crc kubenswrapper[4708]: W0320 16:20:38.249903 4708 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5609997e_9b3b_4472_a91a_0948eacb77f1.slice/crio-0e01d8f1e18eb30519d83c27d5e367d6cd19845d2a6bfd10adc813694034028d WatchSource:0}: Error finding container 0e01d8f1e18eb30519d83c27d5e367d6cd19845d2a6bfd10adc813694034028d: Status 404 returned error can't find the container with id 0e01d8f1e18eb30519d83c27d5e367d6cd19845d2a6bfd10adc813694034028d Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.254061 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3dc6-account-create-update-cqsnp"] Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.319521 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dw2jc" event={"ID":"720c033b-2069-47be-b543-00c6005b496b","Type":"ContainerStarted","Data":"30da198c7b5099cb93e2396975abb01d064840c33daffa22b7fc8b0ec08090b0"} Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.319576 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dw2jc" event={"ID":"720c033b-2069-47be-b543-00c6005b496b","Type":"ContainerStarted","Data":"aaa30f1db8263db44b60389cdeab04a585d523b4561a1c7770df02d0111113fa"} Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.322313 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3dc6-account-create-update-cqsnp" event={"ID":"5609997e-9b3b-4472-a91a-0948eacb77f1","Type":"ContainerStarted","Data":"0e01d8f1e18eb30519d83c27d5e367d6cd19845d2a6bfd10adc813694034028d"} Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.324460 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-p756b" event={"ID":"50a31e27-283d-43b0-91b5-71c548d61e27","Type":"ContainerDied","Data":"fee703f4c5bba3d9df964c52f8836f36d16b8c2caeeab07c785e28a856f6f582"} Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.324498 4708 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="fee703f4c5bba3d9df964c52f8836f36d16b8c2caeeab07c785e28a856f6f582" Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.324591 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-p756b" Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.327239 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h7q7q" event={"ID":"c03712fd-33bf-454a-a7f7-f907e9b9c0ec","Type":"ContainerDied","Data":"87bc82cdd6d20f0547e9c39f402ef94116dd83b8a52430a5a7e483757c8d8e51"} Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.327282 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87bc82cdd6d20f0547e9c39f402ef94116dd83b8a52430a5a7e483757c8d8e51" Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.327363 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h7q7q" Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.334080 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8c1b-account-create-update-r92gp" event={"ID":"e5e322f6-c3f7-4870-84ae-f27c1d4ba293","Type":"ContainerDied","Data":"eb2ee71fc2959d0853d116f4882682af7e9f5578b5585e9eacafd3e9b19f98b0"} Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.334395 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb2ee71fc2959d0853d116f4882682af7e9f5578b5585e9eacafd3e9b19f98b0" Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.334194 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8c1b-account-create-update-r92gp" Mar 20 16:20:38 crc kubenswrapper[4708]: I0320 16:20:38.340785 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-dw2jc" podStartSLOduration=1.340765953 podStartE2EDuration="1.340765953s" podCreationTimestamp="2026-03-20 16:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:20:38.336313032 +0000 UTC m=+1193.010649747" watchObservedRunningTime="2026-03-20 16:20:38.340765953 +0000 UTC m=+1193.015102668" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.077931 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-k8vjm"] Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.085268 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-k8vjm"] Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.173154 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bw6zk"] Mar 20 16:20:39 crc kubenswrapper[4708]: E0320 16:20:39.173836 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03712fd-33bf-454a-a7f7-f907e9b9c0ec" containerName="mariadb-database-create" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.173853 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03712fd-33bf-454a-a7f7-f907e9b9c0ec" containerName="mariadb-database-create" Mar 20 16:20:39 crc kubenswrapper[4708]: E0320 16:20:39.173868 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e322f6-c3f7-4870-84ae-f27c1d4ba293" containerName="mariadb-account-create-update" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.173875 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e322f6-c3f7-4870-84ae-f27c1d4ba293" containerName="mariadb-account-create-update" Mar 20 
16:20:39 crc kubenswrapper[4708]: E0320 16:20:39.173890 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a31e27-283d-43b0-91b5-71c548d61e27" containerName="mariadb-database-create" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.173896 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a31e27-283d-43b0-91b5-71c548d61e27" containerName="mariadb-database-create" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.174059 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e322f6-c3f7-4870-84ae-f27c1d4ba293" containerName="mariadb-account-create-update" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.174078 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03712fd-33bf-454a-a7f7-f907e9b9c0ec" containerName="mariadb-database-create" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.174097 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a31e27-283d-43b0-91b5-71c548d61e27" containerName="mariadb-database-create" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.174674 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bw6zk" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.178643 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.181102 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bw6zk"] Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.248372 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-operator-scripts\") pod \"root-account-create-update-bw6zk\" (UID: \"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc\") " pod="openstack/root-account-create-update-bw6zk" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.248473 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv4sk\" (UniqueName: \"kubernetes.io/projected/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-kube-api-access-bv4sk\") pod \"root-account-create-update-bw6zk\" (UID: \"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc\") " pod="openstack/root-account-create-update-bw6zk" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.342938 4708 generic.go:334] "Generic (PLEG): container finished" podID="720c033b-2069-47be-b543-00c6005b496b" containerID="30da198c7b5099cb93e2396975abb01d064840c33daffa22b7fc8b0ec08090b0" exitCode=0 Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.342991 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dw2jc" event={"ID":"720c033b-2069-47be-b543-00c6005b496b","Type":"ContainerDied","Data":"30da198c7b5099cb93e2396975abb01d064840c33daffa22b7fc8b0ec08090b0"} Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.346321 4708 generic.go:334] "Generic (PLEG): container finished" podID="5609997e-9b3b-4472-a91a-0948eacb77f1" 
containerID="2c0452b3fef5af38801341cd45e074b156bd620faca4a5a65af79d8cdcac09e5" exitCode=0 Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.346364 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3dc6-account-create-update-cqsnp" event={"ID":"5609997e-9b3b-4472-a91a-0948eacb77f1","Type":"ContainerDied","Data":"2c0452b3fef5af38801341cd45e074b156bd620faca4a5a65af79d8cdcac09e5"} Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.350760 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-operator-scripts\") pod \"root-account-create-update-bw6zk\" (UID: \"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc\") " pod="openstack/root-account-create-update-bw6zk" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.350838 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv4sk\" (UniqueName: \"kubernetes.io/projected/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-kube-api-access-bv4sk\") pod \"root-account-create-update-bw6zk\" (UID: \"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc\") " pod="openstack/root-account-create-update-bw6zk" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.352245 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-operator-scripts\") pod \"root-account-create-update-bw6zk\" (UID: \"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc\") " pod="openstack/root-account-create-update-bw6zk" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.371249 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv4sk\" (UniqueName: \"kubernetes.io/projected/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-kube-api-access-bv4sk\") pod \"root-account-create-update-bw6zk\" (UID: \"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc\") " 
pod="openstack/root-account-create-update-bw6zk" Mar 20 16:20:39 crc kubenswrapper[4708]: I0320 16:20:39.539943 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bw6zk" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.006626 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bw6zk"] Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.123990 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c973f5-81e6-41d1-ad8c-7ab713e44a75" path="/var/lib/kubelet/pods/a2c973f5-81e6-41d1-ad8c-7ab713e44a75/volumes" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.356291 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bw6zk" event={"ID":"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc","Type":"ContainerStarted","Data":"b9631cc0460a6cf01a43dcf7e777b24acea1d0e2a5016feaa71440eec26f3164"} Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.356330 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bw6zk" event={"ID":"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc","Type":"ContainerStarted","Data":"926852771eb85534f6acc4222311f18c0daefcc657d99a45d3f05016d6fad63a"} Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.696496 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3dc6-account-create-update-cqsnp" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.721990 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-bw6zk" podStartSLOduration=1.7219646819999999 podStartE2EDuration="1.721964682s" podCreationTimestamp="2026-03-20 16:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:20:40.374324555 +0000 UTC m=+1195.048661280" watchObservedRunningTime="2026-03-20 16:20:40.721964682 +0000 UTC m=+1195.396301397" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.748694 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dw2jc" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.785827 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5609997e-9b3b-4472-a91a-0948eacb77f1-operator-scripts\") pod \"5609997e-9b3b-4472-a91a-0948eacb77f1\" (UID: \"5609997e-9b3b-4472-a91a-0948eacb77f1\") " Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.786042 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/720c033b-2069-47be-b543-00c6005b496b-operator-scripts\") pod \"720c033b-2069-47be-b543-00c6005b496b\" (UID: \"720c033b-2069-47be-b543-00c6005b496b\") " Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.786087 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mbng\" (UniqueName: \"kubernetes.io/projected/5609997e-9b3b-4472-a91a-0948eacb77f1-kube-api-access-5mbng\") pod \"5609997e-9b3b-4472-a91a-0948eacb77f1\" (UID: \"5609997e-9b3b-4472-a91a-0948eacb77f1\") " Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.786137 4708 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snp49\" (UniqueName: \"kubernetes.io/projected/720c033b-2069-47be-b543-00c6005b496b-kube-api-access-snp49\") pod \"720c033b-2069-47be-b543-00c6005b496b\" (UID: \"720c033b-2069-47be-b543-00c6005b496b\") " Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.792038 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720c033b-2069-47be-b543-00c6005b496b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "720c033b-2069-47be-b543-00c6005b496b" (UID: "720c033b-2069-47be-b543-00c6005b496b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.792146 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5609997e-9b3b-4472-a91a-0948eacb77f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5609997e-9b3b-4472-a91a-0948eacb77f1" (UID: "5609997e-9b3b-4472-a91a-0948eacb77f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.795045 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720c033b-2069-47be-b543-00c6005b496b-kube-api-access-snp49" (OuterVolumeSpecName: "kube-api-access-snp49") pod "720c033b-2069-47be-b543-00c6005b496b" (UID: "720c033b-2069-47be-b543-00c6005b496b"). InnerVolumeSpecName "kube-api-access-snp49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.796126 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5609997e-9b3b-4472-a91a-0948eacb77f1-kube-api-access-5mbng" (OuterVolumeSpecName: "kube-api-access-5mbng") pod "5609997e-9b3b-4472-a91a-0948eacb77f1" (UID: "5609997e-9b3b-4472-a91a-0948eacb77f1"). InnerVolumeSpecName "kube-api-access-5mbng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.889340 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/720c033b-2069-47be-b543-00c6005b496b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.889379 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mbng\" (UniqueName: \"kubernetes.io/projected/5609997e-9b3b-4472-a91a-0948eacb77f1-kube-api-access-5mbng\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.889396 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snp49\" (UniqueName: \"kubernetes.io/projected/720c033b-2069-47be-b543-00c6005b496b-kube-api-access-snp49\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:40 crc kubenswrapper[4708]: I0320 16:20:40.889409 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5609997e-9b3b-4472-a91a-0948eacb77f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:41 crc kubenswrapper[4708]: I0320 16:20:41.367596 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3dc6-account-create-update-cqsnp" Mar 20 16:20:41 crc kubenswrapper[4708]: I0320 16:20:41.367609 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3dc6-account-create-update-cqsnp" event={"ID":"5609997e-9b3b-4472-a91a-0948eacb77f1","Type":"ContainerDied","Data":"0e01d8f1e18eb30519d83c27d5e367d6cd19845d2a6bfd10adc813694034028d"} Mar 20 16:20:41 crc kubenswrapper[4708]: I0320 16:20:41.367993 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e01d8f1e18eb30519d83c27d5e367d6cd19845d2a6bfd10adc813694034028d" Mar 20 16:20:41 crc kubenswrapper[4708]: I0320 16:20:41.369356 4708 generic.go:334] "Generic (PLEG): container finished" podID="c397eb69-e70b-4a8b-8c3f-162c06ccc6bc" containerID="b9631cc0460a6cf01a43dcf7e777b24acea1d0e2a5016feaa71440eec26f3164" exitCode=0 Mar 20 16:20:41 crc kubenswrapper[4708]: I0320 16:20:41.369430 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bw6zk" event={"ID":"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc","Type":"ContainerDied","Data":"b9631cc0460a6cf01a43dcf7e777b24acea1d0e2a5016feaa71440eec26f3164"} Mar 20 16:20:41 crc kubenswrapper[4708]: I0320 16:20:41.371754 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dw2jc" event={"ID":"720c033b-2069-47be-b543-00c6005b496b","Type":"ContainerDied","Data":"aaa30f1db8263db44b60389cdeab04a585d523b4561a1c7770df02d0111113fa"} Mar 20 16:20:41 crc kubenswrapper[4708]: I0320 16:20:41.371785 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaa30f1db8263db44b60389cdeab04a585d523b4561a1c7770df02d0111113fa" Mar 20 16:20:41 crc kubenswrapper[4708]: I0320 16:20:41.371830 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dw2jc" Mar 20 16:20:41 crc kubenswrapper[4708]: E0320 16:20:41.534969 4708 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod720c033b_2069_47be_b543_00c6005b496b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5609997e_9b3b_4472_a91a_0948eacb77f1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod720c033b_2069_47be_b543_00c6005b496b.slice/crio-aaa30f1db8263db44b60389cdeab04a585d523b4561a1c7770df02d0111113fa\": RecentStats: unable to find data in memory cache]" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.448043 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-rpnp8"] Mar 20 16:20:42 crc kubenswrapper[4708]: E0320 16:20:42.448523 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5609997e-9b3b-4472-a91a-0948eacb77f1" containerName="mariadb-account-create-update" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.448543 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="5609997e-9b3b-4472-a91a-0948eacb77f1" containerName="mariadb-account-create-update" Mar 20 16:20:42 crc kubenswrapper[4708]: E0320 16:20:42.448579 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720c033b-2069-47be-b543-00c6005b496b" containerName="mariadb-database-create" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.448588 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="720c033b-2069-47be-b543-00c6005b496b" containerName="mariadb-database-create" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.448783 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="720c033b-2069-47be-b543-00c6005b496b" 
containerName="mariadb-database-create" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.448808 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="5609997e-9b3b-4472-a91a-0948eacb77f1" containerName="mariadb-account-create-update" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.449519 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.452557 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.453743 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fkxnk" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.462181 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rpnp8"] Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.521195 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-config-data\") pod \"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.521269 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-combined-ca-bundle\") pod \"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.521527 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-db-sync-config-data\") pod 
\"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.521661 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jkq7\" (UniqueName: \"kubernetes.io/projected/e73e6a53-ccd4-45bf-ad96-a6de1e696888-kube-api-access-4jkq7\") pod \"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.623470 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-db-sync-config-data\") pod \"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.623528 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jkq7\" (UniqueName: \"kubernetes.io/projected/e73e6a53-ccd4-45bf-ad96-a6de1e696888-kube-api-access-4jkq7\") pod \"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.623613 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-config-data\") pod \"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.623640 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-combined-ca-bundle\") pod \"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " 
pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.631878 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-db-sync-config-data\") pod \"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.631941 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-combined-ca-bundle\") pod \"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.631969 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-config-data\") pod \"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.643436 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jkq7\" (UniqueName: \"kubernetes.io/projected/e73e6a53-ccd4-45bf-ad96-a6de1e696888-kube-api-access-4jkq7\") pod \"glance-db-sync-rpnp8\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") " pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.740334 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bw6zk" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.769831 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rpnp8" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.826535 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv4sk\" (UniqueName: \"kubernetes.io/projected/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-kube-api-access-bv4sk\") pod \"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc\" (UID: \"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc\") " Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.826719 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-operator-scripts\") pod \"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc\" (UID: \"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc\") " Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.827204 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c397eb69-e70b-4a8b-8c3f-162c06ccc6bc" (UID: "c397eb69-e70b-4a8b-8c3f-162c06ccc6bc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.827531 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.831075 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-kube-api-access-bv4sk" (OuterVolumeSpecName: "kube-api-access-bv4sk") pod "c397eb69-e70b-4a8b-8c3f-162c06ccc6bc" (UID: "c397eb69-e70b-4a8b-8c3f-162c06ccc6bc"). InnerVolumeSpecName "kube-api-access-bv4sk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:20:42 crc kubenswrapper[4708]: I0320 16:20:42.928986 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv4sk\" (UniqueName: \"kubernetes.io/projected/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc-kube-api-access-bv4sk\") on node \"crc\" DevicePath \"\"" Mar 20 16:20:43 crc kubenswrapper[4708]: I0320 16:20:43.339340 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 16:20:43 crc kubenswrapper[4708]: I0320 16:20:43.341280 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rpnp8"] Mar 20 16:20:43 crc kubenswrapper[4708]: W0320 16:20:43.343275 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode73e6a53_ccd4_45bf_ad96_a6de1e696888.slice/crio-eff40708df8220974aab932d0dba72e5682c7cc92ee1fc0117de989f1113dae5 WatchSource:0}: Error finding container eff40708df8220974aab932d0dba72e5682c7cc92ee1fc0117de989f1113dae5: Status 404 returned error can't find the container with id eff40708df8220974aab932d0dba72e5682c7cc92ee1fc0117de989f1113dae5 Mar 20 16:20:43 crc kubenswrapper[4708]: I0320 16:20:43.387693 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bw6zk" event={"ID":"c397eb69-e70b-4a8b-8c3f-162c06ccc6bc","Type":"ContainerDied","Data":"926852771eb85534f6acc4222311f18c0daefcc657d99a45d3f05016d6fad63a"} Mar 20 16:20:43 crc kubenswrapper[4708]: I0320 16:20:43.387731 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="926852771eb85534f6acc4222311f18c0daefcc657d99a45d3f05016d6fad63a" Mar 20 16:20:43 crc kubenswrapper[4708]: I0320 16:20:43.387801 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bw6zk" Mar 20 16:20:43 crc kubenswrapper[4708]: I0320 16:20:43.391420 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rpnp8" event={"ID":"e73e6a53-ccd4-45bf-ad96-a6de1e696888","Type":"ContainerStarted","Data":"eff40708df8220974aab932d0dba72e5682c7cc92ee1fc0117de989f1113dae5"} Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.409136 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhwvr"] Mar 20 16:20:44 crc kubenswrapper[4708]: E0320 16:20:44.413964 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c397eb69-e70b-4a8b-8c3f-162c06ccc6bc" containerName="mariadb-account-create-update" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.413997 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="c397eb69-e70b-4a8b-8c3f-162c06ccc6bc" containerName="mariadb-account-create-update" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.414232 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="c397eb69-e70b-4a8b-8c3f-162c06ccc6bc" containerName="mariadb-account-create-update" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.415094 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.455993 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhwvr"] Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.464048 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-dns-svc\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.464147 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.464172 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz89f\" (UniqueName: \"kubernetes.io/projected/3dc07138-7824-4a54-957c-52e9dd4120ac-kube-api-access-hz89f\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.464217 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.464281 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-config\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.573661 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-config\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.573767 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-dns-svc\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.573815 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.573836 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz89f\" (UniqueName: \"kubernetes.io/projected/3dc07138-7824-4a54-957c-52e9dd4120ac-kube-api-access-hz89f\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.573872 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.575225 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-config\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.583173 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.592524 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-dns-svc\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.593212 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.635382 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz89f\" (UniqueName: 
\"kubernetes.io/projected/3dc07138-7824-4a54-957c-52e9dd4120ac-kube-api-access-hz89f\") pod \"dnsmasq-dns-698758b865-qhwvr\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:44 crc kubenswrapper[4708]: I0320 16:20:44.761154 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.326645 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhwvr"] Mar 20 16:20:45 crc kubenswrapper[4708]: W0320 16:20:45.331236 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dc07138_7824_4a54_957c_52e9dd4120ac.slice/crio-7ff5033d3c65101149ec8da4d40eb9c6ff7d2759b09aadb0674847c1c30afad6 WatchSource:0}: Error finding container 7ff5033d3c65101149ec8da4d40eb9c6ff7d2759b09aadb0674847c1c30afad6: Status 404 returned error can't find the container with id 7ff5033d3c65101149ec8da4d40eb9c6ff7d2759b09aadb0674847c1c30afad6 Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.418313 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhwvr" event={"ID":"3dc07138-7824-4a54-957c-52e9dd4120ac","Type":"ContainerStarted","Data":"7ff5033d3c65101149ec8da4d40eb9c6ff7d2759b09aadb0674847c1c30afad6"} Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.584193 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.591522 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.596645 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-dzz6k" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.596909 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.597025 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.600925 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.607634 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.700033 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.700084 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b88259-a142-42a0-ab2c-bb0980ad9465-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.700142 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b5b88259-a142-42a0-ab2c-bb0980ad9465-lock\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 
16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.700173 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mshfl\" (UniqueName: \"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-kube-api-access-mshfl\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.700210 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.700231 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b88259-a142-42a0-ab2c-bb0980ad9465-cache\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.801338 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mshfl\" (UniqueName: \"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-kube-api-access-mshfl\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.801420 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.801457 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/b5b88259-a142-42a0-ab2c-bb0980ad9465-cache\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.801517 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.801542 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b88259-a142-42a0-ab2c-bb0980ad9465-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.801593 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b5b88259-a142-42a0-ab2c-bb0980ad9465-lock\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: E0320 16:20:45.801944 4708 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 16:20:45 crc kubenswrapper[4708]: E0320 16:20:45.801977 4708 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 16:20:45 crc kubenswrapper[4708]: E0320 16:20:45.802041 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift podName:b5b88259-a142-42a0-ab2c-bb0980ad9465 nodeName:}" failed. 
No retries permitted until 2026-03-20 16:20:46.302020325 +0000 UTC m=+1200.976357040 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift") pod "swift-storage-0" (UID: "b5b88259-a142-42a0-ab2c-bb0980ad9465") : configmap "swift-ring-files" not found Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.802325 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b5b88259-a142-42a0-ab2c-bb0980ad9465-lock\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.802993 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.807871 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b88259-a142-42a0-ab2c-bb0980ad9465-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.818488 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b88259-a142-42a0-ab2c-bb0980ad9465-cache\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.826723 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mshfl\" (UniqueName: 
\"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-kube-api-access-mshfl\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:45 crc kubenswrapper[4708]: I0320 16:20:45.830731 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.206216 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s6247"] Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.207847 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.210465 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.210567 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.211483 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.228794 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s6247"] Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.317178 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.317252 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pkpq\" (UniqueName: \"kubernetes.io/projected/ad381906-1b90-4230-a435-9ed844232ba1-kube-api-access-2pkpq\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.317281 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-combined-ca-bundle\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.317318 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-swiftconf\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: E0320 16:20:46.317372 4708 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 16:20:46 crc kubenswrapper[4708]: E0320 16:20:46.317400 4708 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.317411 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-dispersionconf\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.317445 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad381906-1b90-4230-a435-9ed844232ba1-etc-swift\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: E0320 16:20:46.317490 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift podName:b5b88259-a142-42a0-ab2c-bb0980ad9465 nodeName:}" failed. No retries permitted until 2026-03-20 16:20:47.317469366 +0000 UTC m=+1201.991806161 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift") pod "swift-storage-0" (UID: "b5b88259-a142-42a0-ab2c-bb0980ad9465") : configmap "swift-ring-files" not found Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.317550 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-scripts\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.317592 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-ring-data-devices\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.419915 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-scripts\") pod 
\"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.419968 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-ring-data-devices\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.420203 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pkpq\" (UniqueName: \"kubernetes.io/projected/ad381906-1b90-4230-a435-9ed844232ba1-kube-api-access-2pkpq\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.420236 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-combined-ca-bundle\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.420285 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-swiftconf\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.420332 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-dispersionconf\") pod \"swift-ring-rebalance-s6247\" (UID: 
\"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.420357 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad381906-1b90-4230-a435-9ed844232ba1-etc-swift\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.420826 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad381906-1b90-4230-a435-9ed844232ba1-etc-swift\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.421882 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-scripts\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.422103 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-ring-data-devices\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.426301 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-combined-ca-bundle\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: 
I0320 16:20:46.426724 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-swiftconf\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.427748 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-dispersionconf\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.433013 4708 generic.go:334] "Generic (PLEG): container finished" podID="3dc07138-7824-4a54-957c-52e9dd4120ac" containerID="55c5f797db2a66deea35d687a66aef76774721fa4fbf12e519b6397bb7b2ffb7" exitCode=0 Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.433058 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhwvr" event={"ID":"3dc07138-7824-4a54-957c-52e9dd4120ac","Type":"ContainerDied","Data":"55c5f797db2a66deea35d687a66aef76774721fa4fbf12e519b6397bb7b2ffb7"} Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.445638 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pkpq\" (UniqueName: \"kubernetes.io/projected/ad381906-1b90-4230-a435-9ed844232ba1-kube-api-access-2pkpq\") pod \"swift-ring-rebalance-s6247\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.544114 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-dzz6k" Mar 20 16:20:46 crc kubenswrapper[4708]: I0320 16:20:46.552531 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:20:47 crc kubenswrapper[4708]: I0320 16:20:47.123969 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s6247"] Mar 20 16:20:47 crc kubenswrapper[4708]: W0320 16:20:47.135053 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad381906_1b90_4230_a435_9ed844232ba1.slice/crio-ca0586e2c8b4fb388de9b5436530f579afce5846274a2766a30ca656d1785491 WatchSource:0}: Error finding container ca0586e2c8b4fb388de9b5436530f579afce5846274a2766a30ca656d1785491: Status 404 returned error can't find the container with id ca0586e2c8b4fb388de9b5436530f579afce5846274a2766a30ca656d1785491 Mar 20 16:20:47 crc kubenswrapper[4708]: I0320 16:20:47.342623 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:47 crc kubenswrapper[4708]: E0320 16:20:47.342813 4708 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 16:20:47 crc kubenswrapper[4708]: E0320 16:20:47.342828 4708 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 16:20:47 crc kubenswrapper[4708]: E0320 16:20:47.342877 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift podName:b5b88259-a142-42a0-ab2c-bb0980ad9465 nodeName:}" failed. No retries permitted until 2026-03-20 16:20:49.342863088 +0000 UTC m=+1204.017199803 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift") pod "swift-storage-0" (UID: "b5b88259-a142-42a0-ab2c-bb0980ad9465") : configmap "swift-ring-files" not found Mar 20 16:20:47 crc kubenswrapper[4708]: I0320 16:20:47.446884 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhwvr" event={"ID":"3dc07138-7824-4a54-957c-52e9dd4120ac","Type":"ContainerStarted","Data":"864ab416fb5ece0fa3759dfb52b714e9a2d4af6178734457b61c131fb529c35a"} Mar 20 16:20:47 crc kubenswrapper[4708]: I0320 16:20:47.446996 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:47 crc kubenswrapper[4708]: I0320 16:20:47.449337 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s6247" event={"ID":"ad381906-1b90-4230-a435-9ed844232ba1","Type":"ContainerStarted","Data":"ca0586e2c8b4fb388de9b5436530f579afce5846274a2766a30ca656d1785491"} Mar 20 16:20:47 crc kubenswrapper[4708]: I0320 16:20:47.470558 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-qhwvr" podStartSLOduration=3.470539833 podStartE2EDuration="3.470539833s" podCreationTimestamp="2026-03-20 16:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:20:47.468808976 +0000 UTC m=+1202.143145701" watchObservedRunningTime="2026-03-20 16:20:47.470539833 +0000 UTC m=+1202.144876538" Mar 20 16:20:47 crc kubenswrapper[4708]: I0320 16:20:47.510443 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:20:48 crc kubenswrapper[4708]: I0320 16:20:48.564907 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" 
podUID="1da957bf-1f80-4bef-9033-333fa60118c3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 20 16:20:49 crc kubenswrapper[4708]: I0320 16:20:49.004838 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="103fe6f4-2ac5-430b-9ce4-2d142b273674" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 20 16:20:49 crc kubenswrapper[4708]: I0320 16:20:49.389172 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:49 crc kubenswrapper[4708]: E0320 16:20:49.389355 4708 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 16:20:49 crc kubenswrapper[4708]: E0320 16:20:49.389368 4708 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 16:20:49 crc kubenswrapper[4708]: E0320 16:20:49.389411 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift podName:b5b88259-a142-42a0-ab2c-bb0980ad9465 nodeName:}" failed. No retries permitted until 2026-03-20 16:20:53.389396384 +0000 UTC m=+1208.063733099 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift") pod "swift-storage-0" (UID: "b5b88259-a142-42a0-ab2c-bb0980ad9465") : configmap "swift-ring-files" not found Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.498990 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-w47cd" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.705254 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-22qr2-config-ghmvr"] Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.706353 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.708171 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.721306 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-22qr2-config-ghmvr"] Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.770810 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-additional-scripts\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.771231 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: 
I0320 16:20:52.771427 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-scripts\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.771693 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kcj5\" (UniqueName: \"kubernetes.io/projected/339d5b33-42f5-48e9-9420-a6ebc8a87a70-kube-api-access-2kcj5\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.771727 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-log-ovn\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.771831 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run-ovn\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.873727 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-additional-scripts\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " 
pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.873825 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.873864 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-scripts\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.873922 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-log-ovn\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.873951 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kcj5\" (UniqueName: \"kubernetes.io/projected/339d5b33-42f5-48e9-9420-a6ebc8a87a70-kube-api-access-2kcj5\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.874002 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run-ovn\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " 
pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.874227 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-log-ovn\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.874271 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run-ovn\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.874687 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.876243 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-scripts\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc kubenswrapper[4708]: I0320 16:20:52.877175 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-additional-scripts\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:52 crc 
kubenswrapper[4708]: I0320 16:20:52.898009 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kcj5\" (UniqueName: \"kubernetes.io/projected/339d5b33-42f5-48e9-9420-a6ebc8a87a70-kube-api-access-2kcj5\") pod \"ovn-controller-22qr2-config-ghmvr\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:53 crc kubenswrapper[4708]: I0320 16:20:53.033114 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:20:53 crc kubenswrapper[4708]: I0320 16:20:53.483922 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:20:53 crc kubenswrapper[4708]: E0320 16:20:53.484140 4708 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 16:20:53 crc kubenswrapper[4708]: E0320 16:20:53.484157 4708 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 16:20:53 crc kubenswrapper[4708]: E0320 16:20:53.484292 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift podName:b5b88259-a142-42a0-ab2c-bb0980ad9465 nodeName:}" failed. No retries permitted until 2026-03-20 16:21:01.484269937 +0000 UTC m=+1216.158606672 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift") pod "swift-storage-0" (UID: "b5b88259-a142-42a0-ab2c-bb0980ad9465") : configmap "swift-ring-files" not found Mar 20 16:20:54 crc kubenswrapper[4708]: I0320 16:20:54.086755 4708 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod05ccae0c-99a6-4534-865d-65b3a96a5832"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod05ccae0c-99a6-4534-865d-65b3a96a5832] : Timed out while waiting for systemd to remove kubepods-besteffort-pod05ccae0c_99a6_4534_865d_65b3a96a5832.slice" Mar 20 16:20:54 crc kubenswrapper[4708]: E0320 16:20:54.086804 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod05ccae0c-99a6-4534-865d-65b3a96a5832] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod05ccae0c-99a6-4534-865d-65b3a96a5832] : Timed out while waiting for systemd to remove kubepods-besteffort-pod05ccae0c_99a6_4534_865d_65b3a96a5832.slice" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" podUID="05ccae0c-99a6-4534-865d-65b3a96a5832" Mar 20 16:20:54 crc kubenswrapper[4708]: I0320 16:20:54.579571 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-fzv8c" Mar 20 16:20:54 crc kubenswrapper[4708]: I0320 16:20:54.620192 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-fzv8c"] Mar 20 16:20:54 crc kubenswrapper[4708]: I0320 16:20:54.629817 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-fzv8c"] Mar 20 16:20:54 crc kubenswrapper[4708]: I0320 16:20:54.762831 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:20:54 crc kubenswrapper[4708]: I0320 16:20:54.839287 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z5mdd"] Mar 20 16:20:54 crc kubenswrapper[4708]: I0320 16:20:54.839561 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" podUID="b37996e7-c038-4444-bbe4-df1f32b2b029" containerName="dnsmasq-dns" containerID="cri-o://bdff7d3822cd20ff89be1e41d6d9d9ade14fd0195e6ee91e3526461e0478479c" gracePeriod=10 Mar 20 16:20:55 crc kubenswrapper[4708]: I0320 16:20:55.589304 4708 generic.go:334] "Generic (PLEG): container finished" podID="b37996e7-c038-4444-bbe4-df1f32b2b029" containerID="bdff7d3822cd20ff89be1e41d6d9d9ade14fd0195e6ee91e3526461e0478479c" exitCode=0 Mar 20 16:20:55 crc kubenswrapper[4708]: I0320 16:20:55.589656 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" event={"ID":"b37996e7-c038-4444-bbe4-df1f32b2b029","Type":"ContainerDied","Data":"bdff7d3822cd20ff89be1e41d6d9d9ade14fd0195e6ee91e3526461e0478479c"} Mar 20 16:20:56 crc kubenswrapper[4708]: I0320 16:20:56.122507 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05ccae0c-99a6-4534-865d-65b3a96a5832" path="/var/lib/kubelet/pods/05ccae0c-99a6-4534-865d-65b3a96a5832/volumes" Mar 20 16:20:56 crc kubenswrapper[4708]: I0320 16:20:56.178123 4708 
patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:20:56 crc kubenswrapper[4708]: I0320 16:20:56.178190 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:20:56 crc kubenswrapper[4708]: I0320 16:20:56.178241 4708 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:20:56 crc kubenswrapper[4708]: I0320 16:20:56.179073 4708 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0d736f2e0fcb77d6a0bd7e7c40db6605c442763cf48cb3f0e13e50d606ae696"} pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:20:56 crc kubenswrapper[4708]: I0320 16:20:56.179137 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" containerID="cri-o://a0d736f2e0fcb77d6a0bd7e7c40db6605c442763cf48cb3f0e13e50d606ae696" gracePeriod=600 Mar 20 16:20:56 crc kubenswrapper[4708]: I0320 16:20:56.599929 4708 generic.go:334] "Generic (PLEG): container finished" podID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerID="a0d736f2e0fcb77d6a0bd7e7c40db6605c442763cf48cb3f0e13e50d606ae696" exitCode=0 Mar 20 16:20:56 crc 
kubenswrapper[4708]: I0320 16:20:56.599968 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerDied","Data":"a0d736f2e0fcb77d6a0bd7e7c40db6605c442763cf48cb3f0e13e50d606ae696"} Mar 20 16:20:56 crc kubenswrapper[4708]: I0320 16:20:56.599999 4708 scope.go:117] "RemoveContainer" containerID="b96515fa4390968ae59bbefc52b318ed0531df03cbc4138681a6e2acfc16a612" Mar 20 16:20:57 crc kubenswrapper[4708]: I0320 16:20:57.752772 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-22qr2" podUID="8ad8d5cb-c681-406d-8dee-25f0a0f71b83" containerName="ovn-controller" probeResult="failure" output=< Mar 20 16:20:57 crc kubenswrapper[4708]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 16:20:57 crc kubenswrapper[4708]: > Mar 20 16:20:58 crc kubenswrapper[4708]: I0320 16:20:58.198755 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" podUID="b37996e7-c038-4444-bbe4-df1f32b2b029" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Mar 20 16:20:58 crc kubenswrapper[4708]: I0320 16:20:58.559957 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 16:20:58 crc kubenswrapper[4708]: I0320 16:20:58.902149 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qnn97"] Mar 20 16:20:58 crc kubenswrapper[4708]: I0320 16:20:58.903512 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qnn97" Mar 20 16:20:58 crc kubenswrapper[4708]: I0320 16:20:58.920370 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qnn97"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.000923 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26ppp\" (UniqueName: \"kubernetes.io/projected/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-kube-api-access-26ppp\") pod \"cinder-db-create-qnn97\" (UID: \"15badc78-5ab8-41aa-acfb-4bb1f28bcbab\") " pod="openstack/cinder-db-create-qnn97" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.001020 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-operator-scripts\") pod \"cinder-db-create-qnn97\" (UID: \"15badc78-5ab8-41aa-acfb-4bb1f28bcbab\") " pod="openstack/cinder-db-create-qnn97" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.004857 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.048765 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-42e7-account-create-update-jmvzr"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.049901 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-42e7-account-create-update-jmvzr" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.054886 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.094385 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-42e7-account-create-update-jmvzr"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.102368 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brzct\" (UniqueName: \"kubernetes.io/projected/949acc8b-4603-4567-9864-0328462133a0-kube-api-access-brzct\") pod \"cinder-42e7-account-create-update-jmvzr\" (UID: \"949acc8b-4603-4567-9864-0328462133a0\") " pod="openstack/cinder-42e7-account-create-update-jmvzr" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.102461 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26ppp\" (UniqueName: \"kubernetes.io/projected/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-kube-api-access-26ppp\") pod \"cinder-db-create-qnn97\" (UID: \"15badc78-5ab8-41aa-acfb-4bb1f28bcbab\") " pod="openstack/cinder-db-create-qnn97" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.102536 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-operator-scripts\") pod \"cinder-db-create-qnn97\" (UID: \"15badc78-5ab8-41aa-acfb-4bb1f28bcbab\") " pod="openstack/cinder-db-create-qnn97" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.102598 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949acc8b-4603-4567-9864-0328462133a0-operator-scripts\") pod \"cinder-42e7-account-create-update-jmvzr\" (UID: 
\"949acc8b-4603-4567-9864-0328462133a0\") " pod="openstack/cinder-42e7-account-create-update-jmvzr" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.104797 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-operator-scripts\") pod \"cinder-db-create-qnn97\" (UID: \"15badc78-5ab8-41aa-acfb-4bb1f28bcbab\") " pod="openstack/cinder-db-create-qnn97" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.131332 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26ppp\" (UniqueName: \"kubernetes.io/projected/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-kube-api-access-26ppp\") pod \"cinder-db-create-qnn97\" (UID: \"15badc78-5ab8-41aa-acfb-4bb1f28bcbab\") " pod="openstack/cinder-db-create-qnn97" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.205568 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brzct\" (UniqueName: \"kubernetes.io/projected/949acc8b-4603-4567-9864-0328462133a0-kube-api-access-brzct\") pod \"cinder-42e7-account-create-update-jmvzr\" (UID: \"949acc8b-4603-4567-9864-0328462133a0\") " pod="openstack/cinder-42e7-account-create-update-jmvzr" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.206031 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949acc8b-4603-4567-9864-0328462133a0-operator-scripts\") pod \"cinder-42e7-account-create-update-jmvzr\" (UID: \"949acc8b-4603-4567-9864-0328462133a0\") " pod="openstack/cinder-42e7-account-create-update-jmvzr" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.206798 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949acc8b-4603-4567-9864-0328462133a0-operator-scripts\") pod \"cinder-42e7-account-create-update-jmvzr\" 
(UID: \"949acc8b-4603-4567-9864-0328462133a0\") " pod="openstack/cinder-42e7-account-create-update-jmvzr" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.211401 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-z2bxx"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.212756 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.216986 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vtmgw" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.217263 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.223345 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.224527 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qnn97" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.224839 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.240465 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-z2bxx"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.254294 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brzct\" (UniqueName: \"kubernetes.io/projected/949acc8b-4603-4567-9864-0328462133a0-kube-api-access-brzct\") pod \"cinder-42e7-account-create-update-jmvzr\" (UID: \"949acc8b-4603-4567-9864-0328462133a0\") " pod="openstack/cinder-42e7-account-create-update-jmvzr" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.338053 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-nbvxk"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.342530 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nbvxk" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.365457 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-42e7-account-create-update-jmvzr" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.402924 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nbvxk"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.414929 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6jvn\" (UniqueName: \"kubernetes.io/projected/8d5e7b3a-c1c7-493a-a587-19d751f038be-kube-api-access-p6jvn\") pod \"keystone-db-sync-z2bxx\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.415600 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-config-data\") pod \"keystone-db-sync-z2bxx\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.415720 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-combined-ca-bundle\") pod \"keystone-db-sync-z2bxx\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.450292 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-dg24d"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.452042 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-dg24d" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.456914 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5805-account-create-update-2qn7m"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.459567 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5805-account-create-update-2qn7m" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.470262 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.482997 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dg24d"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.496292 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5805-account-create-update-2qn7m"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.517222 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26pw2\" (UniqueName: \"kubernetes.io/projected/1b15f38d-ded0-4fed-add0-c891d2208014-kube-api-access-26pw2\") pod \"neutron-db-create-dg24d\" (UID: \"1b15f38d-ded0-4fed-add0-c891d2208014\") " pod="openstack/neutron-db-create-dg24d" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.517270 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vk6t\" (UniqueName: \"kubernetes.io/projected/3a9a8c73-828e-42ef-9818-6aab510e8240-kube-api-access-6vk6t\") pod \"barbican-db-create-nbvxk\" (UID: \"3a9a8c73-828e-42ef-9818-6aab510e8240\") " pod="openstack/barbican-db-create-nbvxk" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.517305 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6jvn\" (UniqueName: 
\"kubernetes.io/projected/8d5e7b3a-c1c7-493a-a587-19d751f038be-kube-api-access-p6jvn\") pod \"keystone-db-sync-z2bxx\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.517405 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e28ace2-9e11-4223-b58d-91688cd2ced4-operator-scripts\") pod \"barbican-5805-account-create-update-2qn7m\" (UID: \"4e28ace2-9e11-4223-b58d-91688cd2ced4\") " pod="openstack/barbican-5805-account-create-update-2qn7m" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.517505 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-config-data\") pod \"keystone-db-sync-z2bxx\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.517576 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-combined-ca-bundle\") pod \"keystone-db-sync-z2bxx\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.518602 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmwr8\" (UniqueName: \"kubernetes.io/projected/4e28ace2-9e11-4223-b58d-91688cd2ced4-kube-api-access-mmwr8\") pod \"barbican-5805-account-create-update-2qn7m\" (UID: \"4e28ace2-9e11-4223-b58d-91688cd2ced4\") " pod="openstack/barbican-5805-account-create-update-2qn7m" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.518737 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b15f38d-ded0-4fed-add0-c891d2208014-operator-scripts\") pod \"neutron-db-create-dg24d\" (UID: \"1b15f38d-ded0-4fed-add0-c891d2208014\") " pod="openstack/neutron-db-create-dg24d" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.518801 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a9a8c73-828e-42ef-9818-6aab510e8240-operator-scripts\") pod \"barbican-db-create-nbvxk\" (UID: \"3a9a8c73-828e-42ef-9818-6aab510e8240\") " pod="openstack/barbican-db-create-nbvxk" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.523379 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-config-data\") pod \"keystone-db-sync-z2bxx\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.524730 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-combined-ca-bundle\") pod \"keystone-db-sync-z2bxx\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.534085 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6jvn\" (UniqueName: \"kubernetes.io/projected/8d5e7b3a-c1c7-493a-a587-19d751f038be-kube-api-access-p6jvn\") pod \"keystone-db-sync-z2bxx\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.597331 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.620065 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmwr8\" (UniqueName: \"kubernetes.io/projected/4e28ace2-9e11-4223-b58d-91688cd2ced4-kube-api-access-mmwr8\") pod \"barbican-5805-account-create-update-2qn7m\" (UID: \"4e28ace2-9e11-4223-b58d-91688cd2ced4\") " pod="openstack/barbican-5805-account-create-update-2qn7m" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.620147 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b15f38d-ded0-4fed-add0-c891d2208014-operator-scripts\") pod \"neutron-db-create-dg24d\" (UID: \"1b15f38d-ded0-4fed-add0-c891d2208014\") " pod="openstack/neutron-db-create-dg24d" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.620170 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a9a8c73-828e-42ef-9818-6aab510e8240-operator-scripts\") pod \"barbican-db-create-nbvxk\" (UID: \"3a9a8c73-828e-42ef-9818-6aab510e8240\") " pod="openstack/barbican-db-create-nbvxk" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.620217 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26pw2\" (UniqueName: \"kubernetes.io/projected/1b15f38d-ded0-4fed-add0-c891d2208014-kube-api-access-26pw2\") pod \"neutron-db-create-dg24d\" (UID: \"1b15f38d-ded0-4fed-add0-c891d2208014\") " pod="openstack/neutron-db-create-dg24d" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.620243 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vk6t\" (UniqueName: \"kubernetes.io/projected/3a9a8c73-828e-42ef-9818-6aab510e8240-kube-api-access-6vk6t\") pod \"barbican-db-create-nbvxk\" (UID: 
\"3a9a8c73-828e-42ef-9818-6aab510e8240\") " pod="openstack/barbican-db-create-nbvxk" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.620270 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e28ace2-9e11-4223-b58d-91688cd2ced4-operator-scripts\") pod \"barbican-5805-account-create-update-2qn7m\" (UID: \"4e28ace2-9e11-4223-b58d-91688cd2ced4\") " pod="openstack/barbican-5805-account-create-update-2qn7m" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.620981 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e28ace2-9e11-4223-b58d-91688cd2ced4-operator-scripts\") pod \"barbican-5805-account-create-update-2qn7m\" (UID: \"4e28ace2-9e11-4223-b58d-91688cd2ced4\") " pod="openstack/barbican-5805-account-create-update-2qn7m" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.621867 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a9a8c73-828e-42ef-9818-6aab510e8240-operator-scripts\") pod \"barbican-db-create-nbvxk\" (UID: \"3a9a8c73-828e-42ef-9818-6aab510e8240\") " pod="openstack/barbican-db-create-nbvxk" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.621947 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b15f38d-ded0-4fed-add0-c891d2208014-operator-scripts\") pod \"neutron-db-create-dg24d\" (UID: \"1b15f38d-ded0-4fed-add0-c891d2208014\") " pod="openstack/neutron-db-create-dg24d" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.634956 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5bed-account-create-update-8dhzm"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.636758 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5bed-account-create-update-8dhzm" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.639149 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.641340 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26pw2\" (UniqueName: \"kubernetes.io/projected/1b15f38d-ded0-4fed-add0-c891d2208014-kube-api-access-26pw2\") pod \"neutron-db-create-dg24d\" (UID: \"1b15f38d-ded0-4fed-add0-c891d2208014\") " pod="openstack/neutron-db-create-dg24d" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.641889 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vk6t\" (UniqueName: \"kubernetes.io/projected/3a9a8c73-828e-42ef-9818-6aab510e8240-kube-api-access-6vk6t\") pod \"barbican-db-create-nbvxk\" (UID: \"3a9a8c73-828e-42ef-9818-6aab510e8240\") " pod="openstack/barbican-db-create-nbvxk" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.644144 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmwr8\" (UniqueName: \"kubernetes.io/projected/4e28ace2-9e11-4223-b58d-91688cd2ced4-kube-api-access-mmwr8\") pod \"barbican-5805-account-create-update-2qn7m\" (UID: \"4e28ace2-9e11-4223-b58d-91688cd2ced4\") " pod="openstack/barbican-5805-account-create-update-2qn7m" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.652914 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bed-account-create-update-8dhzm"] Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.714961 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-nbvxk" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.721804 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/247ec5aa-1401-422a-b9f7-71c8c9b4876e-operator-scripts\") pod \"neutron-5bed-account-create-update-8dhzm\" (UID: \"247ec5aa-1401-422a-b9f7-71c8c9b4876e\") " pod="openstack/neutron-5bed-account-create-update-8dhzm" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.721850 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9glz5\" (UniqueName: \"kubernetes.io/projected/247ec5aa-1401-422a-b9f7-71c8c9b4876e-kube-api-access-9glz5\") pod \"neutron-5bed-account-create-update-8dhzm\" (UID: \"247ec5aa-1401-422a-b9f7-71c8c9b4876e\") " pod="openstack/neutron-5bed-account-create-update-8dhzm" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.790628 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dg24d" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.805763 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5805-account-create-update-2qn7m" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.824531 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/247ec5aa-1401-422a-b9f7-71c8c9b4876e-operator-scripts\") pod \"neutron-5bed-account-create-update-8dhzm\" (UID: \"247ec5aa-1401-422a-b9f7-71c8c9b4876e\") " pod="openstack/neutron-5bed-account-create-update-8dhzm" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.824577 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9glz5\" (UniqueName: \"kubernetes.io/projected/247ec5aa-1401-422a-b9f7-71c8c9b4876e-kube-api-access-9glz5\") pod \"neutron-5bed-account-create-update-8dhzm\" (UID: \"247ec5aa-1401-422a-b9f7-71c8c9b4876e\") " pod="openstack/neutron-5bed-account-create-update-8dhzm" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.825705 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/247ec5aa-1401-422a-b9f7-71c8c9b4876e-operator-scripts\") pod \"neutron-5bed-account-create-update-8dhzm\" (UID: \"247ec5aa-1401-422a-b9f7-71c8c9b4876e\") " pod="openstack/neutron-5bed-account-create-update-8dhzm" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.843946 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9glz5\" (UniqueName: \"kubernetes.io/projected/247ec5aa-1401-422a-b9f7-71c8c9b4876e-kube-api-access-9glz5\") pod \"neutron-5bed-account-create-update-8dhzm\" (UID: \"247ec5aa-1401-422a-b9f7-71c8c9b4876e\") " pod="openstack/neutron-5bed-account-create-update-8dhzm" Mar 20 16:20:59 crc kubenswrapper[4708]: I0320 16:20:59.993042 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5bed-account-create-update-8dhzm" Mar 20 16:21:01 crc kubenswrapper[4708]: I0320 16:21:01.554591 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0" Mar 20 16:21:01 crc kubenswrapper[4708]: E0320 16:21:01.554920 4708 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 16:21:01 crc kubenswrapper[4708]: E0320 16:21:01.555757 4708 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 16:21:01 crc kubenswrapper[4708]: E0320 16:21:01.555885 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift podName:b5b88259-a142-42a0-ab2c-bb0980ad9465 nodeName:}" failed. No retries permitted until 2026-03-20 16:21:17.555853617 +0000 UTC m=+1232.230190342 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift") pod "swift-storage-0" (UID: "b5b88259-a142-42a0-ab2c-bb0980ad9465") : configmap "swift-ring-files" not found Mar 20 16:21:02 crc kubenswrapper[4708]: E0320 16:21:02.685790 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Mar 20 16:21:02 crc kubenswrapper[4708]: E0320 16:21:02.686430 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4jkq7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:ni
l,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-rpnp8_openstack(e73e6a53-ccd4-45bf-ad96-a6de1e696888): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:21:02 crc kubenswrapper[4708]: E0320 16:21:02.691564 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-rpnp8" podUID="e73e6a53-ccd4-45bf-ad96-a6de1e696888" Mar 20 16:21:02 crc kubenswrapper[4708]: I0320 16:21:02.807501 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-22qr2" podUID="8ad8d5cb-c681-406d-8dee-25f0a0f71b83" containerName="ovn-controller" probeResult="failure" output=< Mar 20 16:21:02 crc kubenswrapper[4708]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 16:21:02 crc kubenswrapper[4708]: > Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.106476 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.108895 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-nb\") pod \"b37996e7-c038-4444-bbe4-df1f32b2b029\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.108938 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-dns-svc\") pod \"b37996e7-c038-4444-bbe4-df1f32b2b029\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.108989 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-config\") pod \"b37996e7-c038-4444-bbe4-df1f32b2b029\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.109020 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz6c2\" (UniqueName: \"kubernetes.io/projected/b37996e7-c038-4444-bbe4-df1f32b2b029-kube-api-access-sz6c2\") pod \"b37996e7-c038-4444-bbe4-df1f32b2b029\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.109080 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-sb\") pod \"b37996e7-c038-4444-bbe4-df1f32b2b029\" (UID: \"b37996e7-c038-4444-bbe4-df1f32b2b029\") " Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.147336 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b37996e7-c038-4444-bbe4-df1f32b2b029-kube-api-access-sz6c2" (OuterVolumeSpecName: "kube-api-access-sz6c2") pod "b37996e7-c038-4444-bbe4-df1f32b2b029" (UID: "b37996e7-c038-4444-bbe4-df1f32b2b029"). InnerVolumeSpecName "kube-api-access-sz6c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.211873 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz6c2\" (UniqueName: \"kubernetes.io/projected/b37996e7-c038-4444-bbe4-df1f32b2b029-kube-api-access-sz6c2\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.382001 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-config" (OuterVolumeSpecName: "config") pod "b37996e7-c038-4444-bbe4-df1f32b2b029" (UID: "b37996e7-c038-4444-bbe4-df1f32b2b029"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.390014 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b37996e7-c038-4444-bbe4-df1f32b2b029" (UID: "b37996e7-c038-4444-bbe4-df1f32b2b029"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.396642 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b37996e7-c038-4444-bbe4-df1f32b2b029" (UID: "b37996e7-c038-4444-bbe4-df1f32b2b029"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.402492 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b37996e7-c038-4444-bbe4-df1f32b2b029" (UID: "b37996e7-c038-4444-bbe4-df1f32b2b029"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.416143 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.416182 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.416191 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.416202 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37996e7-c038-4444-bbe4-df1f32b2b029-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.428560 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-22qr2-config-ghmvr"] Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.466248 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5805-account-create-update-2qn7m"] Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.587800 4708 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/neutron-5bed-account-create-update-8dhzm"] Mar 20 16:21:03 crc kubenswrapper[4708]: W0320 16:21:03.595349 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod247ec5aa_1401_422a_b9f7_71c8c9b4876e.slice/crio-78905953a09515201b7d7544117f6e9eb88f7ab20185c4a73ed0abf10751bb89 WatchSource:0}: Error finding container 78905953a09515201b7d7544117f6e9eb88f7ab20185c4a73ed0abf10751bb89: Status 404 returned error can't find the container with id 78905953a09515201b7d7544117f6e9eb88f7ab20185c4a73ed0abf10751bb89 Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.618502 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nbvxk"] Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.701945 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerStarted","Data":"1802f9b5863d50f9b0059b425a0ca397c3796b016f71b5db8c43776fd2853ecb"} Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.711975 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" event={"ID":"b37996e7-c038-4444-bbe4-df1f32b2b029","Type":"ContainerDied","Data":"eab8842d016fd070bdb526a82a1fb4dcb62a6d92c322c60d2c63d0c2d066c971"} Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.712058 4708 scope.go:117] "RemoveContainer" containerID="bdff7d3822cd20ff89be1e41d6d9d9ade14fd0195e6ee91e3526461e0478479c" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.711998 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-z5mdd" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.716002 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s6247" event={"ID":"ad381906-1b90-4230-a435-9ed844232ba1","Type":"ContainerStarted","Data":"86f51f8e74439206ebaa11043efc755d838b724e0976d43c4481fc6fc7aaf3f4"} Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.718325 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nbvxk" event={"ID":"3a9a8c73-828e-42ef-9818-6aab510e8240","Type":"ContainerStarted","Data":"6bcd0c950fa1e8753477b0047ffec120b1329dd7d3946acf217a67527081ffb6"} Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.720861 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bed-account-create-update-8dhzm" event={"ID":"247ec5aa-1401-422a-b9f7-71c8c9b4876e","Type":"ContainerStarted","Data":"78905953a09515201b7d7544117f6e9eb88f7ab20185c4a73ed0abf10751bb89"} Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.722145 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5805-account-create-update-2qn7m" event={"ID":"4e28ace2-9e11-4223-b58d-91688cd2ced4","Type":"ContainerStarted","Data":"317905a8e042638c9a6a84eb3de76e71f19b23e7e8391b29d27b4d5baa308467"} Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.722172 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5805-account-create-update-2qn7m" event={"ID":"4e28ace2-9e11-4223-b58d-91688cd2ced4","Type":"ContainerStarted","Data":"fe5708110fc8b898662531d88578cfca7e6d10486496a69ed9002beb2563e43a"} Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.726265 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-22qr2-config-ghmvr" event={"ID":"339d5b33-42f5-48e9-9420-a6ebc8a87a70","Type":"ContainerStarted","Data":"8fb8a196ae455134ea5c722b57c93388760dccc3a1a6e9348a15067004e7b566"} 
Mar 20 16:21:03 crc kubenswrapper[4708]: E0320 16:21:03.728779 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-rpnp8" podUID="e73e6a53-ccd4-45bf-ad96-a6de1e696888" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.760524 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-z2bxx"] Mar 20 16:21:03 crc kubenswrapper[4708]: W0320 16:21:03.765682 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod949acc8b_4603_4567_9864_0328462133a0.slice/crio-66573adc26884fe9ac19112b37dd18c3a58b65282f31909ea6ec59d3f04c21c0 WatchSource:0}: Error finding container 66573adc26884fe9ac19112b37dd18c3a58b65282f31909ea6ec59d3f04c21c0: Status 404 returned error can't find the container with id 66573adc26884fe9ac19112b37dd18c3a58b65282f31909ea6ec59d3f04c21c0 Mar 20 16:21:03 crc kubenswrapper[4708]: W0320 16:21:03.768582 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d5e7b3a_c1c7_493a_a587_19d751f038be.slice/crio-206010f36a96475fb25849a00d97c70ac88e4e03d43f77787adf34f3379c1620 WatchSource:0}: Error finding container 206010f36a96475fb25849a00d97c70ac88e4e03d43f77787adf34f3379c1620: Status 404 returned error can't find the container with id 206010f36a96475fb25849a00d97c70ac88e4e03d43f77787adf34f3379c1620 Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.778328 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-42e7-account-create-update-jmvzr"] Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.796497 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-s6247" 
podStartSLOduration=2.214608363 podStartE2EDuration="17.796476286s" podCreationTimestamp="2026-03-20 16:20:46 +0000 UTC" firstStartedPulling="2026-03-20 16:20:47.141244189 +0000 UTC m=+1201.815580904" lastFinishedPulling="2026-03-20 16:21:02.723112102 +0000 UTC m=+1217.397448827" observedRunningTime="2026-03-20 16:21:03.756179133 +0000 UTC m=+1218.430515848" watchObservedRunningTime="2026-03-20 16:21:03.796476286 +0000 UTC m=+1218.470812991" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.802607 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-5805-account-create-update-2qn7m" podStartSLOduration=4.802589204 podStartE2EDuration="4.802589204s" podCreationTimestamp="2026-03-20 16:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:03.77544363 +0000 UTC m=+1218.449780345" watchObservedRunningTime="2026-03-20 16:21:03.802589204 +0000 UTC m=+1218.476925929" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.864079 4708 scope.go:117] "RemoveContainer" containerID="3a3e2de04ed18996385382077fefdb5cc50aec60910d715ea1ac23a7e18ad66e" Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.873122 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qnn97"] Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.881532 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z5mdd"] Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.893758 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-z5mdd"] Mar 20 16:21:03 crc kubenswrapper[4708]: W0320 16:21:03.902897 4708 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b15f38d_ded0_4fed_add0_c891d2208014.slice/crio-5b822e364bffae8f871b75e94898897edd22d5cbfba0c637a2066588f21a4c1b WatchSource:0}: Error finding container 5b822e364bffae8f871b75e94898897edd22d5cbfba0c637a2066588f21a4c1b: Status 404 returned error can't find the container with id 5b822e364bffae8f871b75e94898897edd22d5cbfba0c637a2066588f21a4c1b Mar 20 16:21:03 crc kubenswrapper[4708]: I0320 16:21:03.903825 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dg24d"] Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.126439 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b37996e7-c038-4444-bbe4-df1f32b2b029" path="/var/lib/kubelet/pods/b37996e7-c038-4444-bbe4-df1f32b2b029/volumes" Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.736859 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z2bxx" event={"ID":"8d5e7b3a-c1c7-493a-a587-19d751f038be","Type":"ContainerStarted","Data":"206010f36a96475fb25849a00d97c70ac88e4e03d43f77787adf34f3379c1620"} Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.738567 4708 generic.go:334] "Generic (PLEG): container finished" podID="949acc8b-4603-4567-9864-0328462133a0" containerID="b5a50a5b9ae667b828ddcee663d732f69dd3e8ff3e48202185ac4b1a3015e7d0" exitCode=0 Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.738663 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-42e7-account-create-update-jmvzr" event={"ID":"949acc8b-4603-4567-9864-0328462133a0","Type":"ContainerDied","Data":"b5a50a5b9ae667b828ddcee663d732f69dd3e8ff3e48202185ac4b1a3015e7d0"} Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.738730 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-42e7-account-create-update-jmvzr" 
event={"ID":"949acc8b-4603-4567-9864-0328462133a0","Type":"ContainerStarted","Data":"66573adc26884fe9ac19112b37dd18c3a58b65282f31909ea6ec59d3f04c21c0"} Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.740417 4708 generic.go:334] "Generic (PLEG): container finished" podID="247ec5aa-1401-422a-b9f7-71c8c9b4876e" containerID="21f2c20e02291265dbd53b60bc1e391f02382442da3143ab1abc187e58b03aed" exitCode=0 Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.740472 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bed-account-create-update-8dhzm" event={"ID":"247ec5aa-1401-422a-b9f7-71c8c9b4876e","Type":"ContainerDied","Data":"21f2c20e02291265dbd53b60bc1e391f02382442da3143ab1abc187e58b03aed"} Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.749343 4708 generic.go:334] "Generic (PLEG): container finished" podID="3a9a8c73-828e-42ef-9818-6aab510e8240" containerID="82f81a2906e9aafa474771012c56f045dcf7464bb996d8872ecd4aff4e0da9ec" exitCode=0 Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.749418 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nbvxk" event={"ID":"3a9a8c73-828e-42ef-9818-6aab510e8240","Type":"ContainerDied","Data":"82f81a2906e9aafa474771012c56f045dcf7464bb996d8872ecd4aff4e0da9ec"} Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.751214 4708 generic.go:334] "Generic (PLEG): container finished" podID="1b15f38d-ded0-4fed-add0-c891d2208014" containerID="075295be7d5f1e1262933d9308b0c5783279ff6465320622f55a2f0933c7df02" exitCode=0 Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.751266 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dg24d" event={"ID":"1b15f38d-ded0-4fed-add0-c891d2208014","Type":"ContainerDied","Data":"075295be7d5f1e1262933d9308b0c5783279ff6465320622f55a2f0933c7df02"} Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.751284 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-dg24d" event={"ID":"1b15f38d-ded0-4fed-add0-c891d2208014","Type":"ContainerStarted","Data":"5b822e364bffae8f871b75e94898897edd22d5cbfba0c637a2066588f21a4c1b"} Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.756367 4708 generic.go:334] "Generic (PLEG): container finished" podID="4e28ace2-9e11-4223-b58d-91688cd2ced4" containerID="317905a8e042638c9a6a84eb3de76e71f19b23e7e8391b29d27b4d5baa308467" exitCode=0 Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.756561 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5805-account-create-update-2qn7m" event={"ID":"4e28ace2-9e11-4223-b58d-91688cd2ced4","Type":"ContainerDied","Data":"317905a8e042638c9a6a84eb3de76e71f19b23e7e8391b29d27b4d5baa308467"} Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.766980 4708 generic.go:334] "Generic (PLEG): container finished" podID="339d5b33-42f5-48e9-9420-a6ebc8a87a70" containerID="d0f141d43349a6b8e3ce00c7e3d44db4e425e7d8f794c14e2a3919f139885855" exitCode=0 Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.767120 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-22qr2-config-ghmvr" event={"ID":"339d5b33-42f5-48e9-9420-a6ebc8a87a70","Type":"ContainerDied","Data":"d0f141d43349a6b8e3ce00c7e3d44db4e425e7d8f794c14e2a3919f139885855"} Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.770845 4708 generic.go:334] "Generic (PLEG): container finished" podID="15badc78-5ab8-41aa-acfb-4bb1f28bcbab" containerID="f0a2395cbb3be1be4591e8307948992da95749400177c63c6be783deffd7426a" exitCode=0 Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.770988 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qnn97" event={"ID":"15badc78-5ab8-41aa-acfb-4bb1f28bcbab","Type":"ContainerDied","Data":"f0a2395cbb3be1be4591e8307948992da95749400177c63c6be783deffd7426a"} Mar 20 16:21:04 crc kubenswrapper[4708]: I0320 16:21:04.771027 4708 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qnn97" event={"ID":"15badc78-5ab8-41aa-acfb-4bb1f28bcbab","Type":"ContainerStarted","Data":"a1a296b47ae39b44644af3f619c5ec2415ec1bd7c7d03b0fe89748a20056591b"} Mar 20 16:21:07 crc kubenswrapper[4708]: I0320 16:21:07.749349 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-22qr2" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.167878 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nbvxk" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.177393 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.192143 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bed-account-create-update-8dhzm" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.219275 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qnn97" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.228250 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5805-account-create-update-2qn7m" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.249190 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dg24d" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.253960 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-42e7-account-create-update-jmvzr" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.254132 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run-ovn\") pod \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.254192 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run\") pod \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.254218 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-scripts\") pod \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.254248 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-additional-scripts\") pod \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.254264 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-log-ovn\") pod \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.254285 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9glz5\" 
(UniqueName: \"kubernetes.io/projected/247ec5aa-1401-422a-b9f7-71c8c9b4876e-kube-api-access-9glz5\") pod \"247ec5aa-1401-422a-b9f7-71c8c9b4876e\" (UID: \"247ec5aa-1401-422a-b9f7-71c8c9b4876e\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.254331 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kcj5\" (UniqueName: \"kubernetes.io/projected/339d5b33-42f5-48e9-9420-a6ebc8a87a70-kube-api-access-2kcj5\") pod \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\" (UID: \"339d5b33-42f5-48e9-9420-a6ebc8a87a70\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.254354 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/247ec5aa-1401-422a-b9f7-71c8c9b4876e-operator-scripts\") pod \"247ec5aa-1401-422a-b9f7-71c8c9b4876e\" (UID: \"247ec5aa-1401-422a-b9f7-71c8c9b4876e\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.254377 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a9a8c73-828e-42ef-9818-6aab510e8240-operator-scripts\") pod \"3a9a8c73-828e-42ef-9818-6aab510e8240\" (UID: \"3a9a8c73-828e-42ef-9818-6aab510e8240\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.254478 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vk6t\" (UniqueName: \"kubernetes.io/projected/3a9a8c73-828e-42ef-9818-6aab510e8240-kube-api-access-6vk6t\") pod \"3a9a8c73-828e-42ef-9818-6aab510e8240\" (UID: \"3a9a8c73-828e-42ef-9818-6aab510e8240\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.254566 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "339d5b33-42f5-48e9-9420-a6ebc8a87a70" (UID: "339d5b33-42f5-48e9-9420-a6ebc8a87a70"). 
InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.255170 4708 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.256395 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "339d5b33-42f5-48e9-9420-a6ebc8a87a70" (UID: "339d5b33-42f5-48e9-9420-a6ebc8a87a70"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.256700 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-scripts" (OuterVolumeSpecName: "scripts") pod "339d5b33-42f5-48e9-9420-a6ebc8a87a70" (UID: "339d5b33-42f5-48e9-9420-a6ebc8a87a70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.256755 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "339d5b33-42f5-48e9-9420-a6ebc8a87a70" (UID: "339d5b33-42f5-48e9-9420-a6ebc8a87a70"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.256778 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run" (OuterVolumeSpecName: "var-run") pod "339d5b33-42f5-48e9-9420-a6ebc8a87a70" (UID: "339d5b33-42f5-48e9-9420-a6ebc8a87a70"). 
InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.259142 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/247ec5aa-1401-422a-b9f7-71c8c9b4876e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "247ec5aa-1401-422a-b9f7-71c8c9b4876e" (UID: "247ec5aa-1401-422a-b9f7-71c8c9b4876e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.260562 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9a8c73-828e-42ef-9818-6aab510e8240-kube-api-access-6vk6t" (OuterVolumeSpecName: "kube-api-access-6vk6t") pod "3a9a8c73-828e-42ef-9818-6aab510e8240" (UID: "3a9a8c73-828e-42ef-9818-6aab510e8240"). InnerVolumeSpecName "kube-api-access-6vk6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.260750 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9a8c73-828e-42ef-9818-6aab510e8240-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a9a8c73-828e-42ef-9818-6aab510e8240" (UID: "3a9a8c73-828e-42ef-9818-6aab510e8240"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.261700 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339d5b33-42f5-48e9-9420-a6ebc8a87a70-kube-api-access-2kcj5" (OuterVolumeSpecName: "kube-api-access-2kcj5") pod "339d5b33-42f5-48e9-9420-a6ebc8a87a70" (UID: "339d5b33-42f5-48e9-9420-a6ebc8a87a70"). InnerVolumeSpecName "kube-api-access-2kcj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.263313 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247ec5aa-1401-422a-b9f7-71c8c9b4876e-kube-api-access-9glz5" (OuterVolumeSpecName: "kube-api-access-9glz5") pod "247ec5aa-1401-422a-b9f7-71c8c9b4876e" (UID: "247ec5aa-1401-422a-b9f7-71c8c9b4876e"). InnerVolumeSpecName "kube-api-access-9glz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.357108 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26pw2\" (UniqueName: \"kubernetes.io/projected/1b15f38d-ded0-4fed-add0-c891d2208014-kube-api-access-26pw2\") pod \"1b15f38d-ded0-4fed-add0-c891d2208014\" (UID: \"1b15f38d-ded0-4fed-add0-c891d2208014\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.357174 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-operator-scripts\") pod \"15badc78-5ab8-41aa-acfb-4bb1f28bcbab\" (UID: \"15badc78-5ab8-41aa-acfb-4bb1f28bcbab\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.357227 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e28ace2-9e11-4223-b58d-91688cd2ced4-operator-scripts\") pod \"4e28ace2-9e11-4223-b58d-91688cd2ced4\" (UID: \"4e28ace2-9e11-4223-b58d-91688cd2ced4\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.357306 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949acc8b-4603-4567-9864-0328462133a0-operator-scripts\") pod \"949acc8b-4603-4567-9864-0328462133a0\" (UID: \"949acc8b-4603-4567-9864-0328462133a0\") " Mar 20 16:21:08 crc 
kubenswrapper[4708]: I0320 16:21:08.357441 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b15f38d-ded0-4fed-add0-c891d2208014-operator-scripts\") pod \"1b15f38d-ded0-4fed-add0-c891d2208014\" (UID: \"1b15f38d-ded0-4fed-add0-c891d2208014\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.357482 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26ppp\" (UniqueName: \"kubernetes.io/projected/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-kube-api-access-26ppp\") pod \"15badc78-5ab8-41aa-acfb-4bb1f28bcbab\" (UID: \"15badc78-5ab8-41aa-acfb-4bb1f28bcbab\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.357591 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmwr8\" (UniqueName: \"kubernetes.io/projected/4e28ace2-9e11-4223-b58d-91688cd2ced4-kube-api-access-mmwr8\") pod \"4e28ace2-9e11-4223-b58d-91688cd2ced4\" (UID: \"4e28ace2-9e11-4223-b58d-91688cd2ced4\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.357710 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brzct\" (UniqueName: \"kubernetes.io/projected/949acc8b-4603-4567-9864-0328462133a0-kube-api-access-brzct\") pod \"949acc8b-4603-4567-9864-0328462133a0\" (UID: \"949acc8b-4603-4567-9864-0328462133a0\") " Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.357921 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15badc78-5ab8-41aa-acfb-4bb1f28bcbab" (UID: "15badc78-5ab8-41aa-acfb-4bb1f28bcbab"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.357996 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b15f38d-ded0-4fed-add0-c891d2208014-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b15f38d-ded0-4fed-add0-c891d2208014" (UID: "1b15f38d-ded0-4fed-add0-c891d2208014"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358140 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a9a8c73-828e-42ef-9818-6aab510e8240-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358165 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358179 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vk6t\" (UniqueName: \"kubernetes.io/projected/3a9a8c73-828e-42ef-9818-6aab510e8240-kube-api-access-6vk6t\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358196 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b15f38d-ded0-4fed-add0-c891d2208014-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358210 4708 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358218 4708 reconciler_common.go:293] "Volume detached for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/339d5b33-42f5-48e9-9420-a6ebc8a87a70-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358227 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358238 4708 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/339d5b33-42f5-48e9-9420-a6ebc8a87a70-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358249 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9glz5\" (UniqueName: \"kubernetes.io/projected/247ec5aa-1401-422a-b9f7-71c8c9b4876e-kube-api-access-9glz5\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358259 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kcj5\" (UniqueName: \"kubernetes.io/projected/339d5b33-42f5-48e9-9420-a6ebc8a87a70-kube-api-access-2kcj5\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358272 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/247ec5aa-1401-422a-b9f7-71c8c9b4876e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358552 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949acc8b-4603-4567-9864-0328462133a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "949acc8b-4603-4567-9864-0328462133a0" (UID: "949acc8b-4603-4567-9864-0328462133a0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.358891 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e28ace2-9e11-4223-b58d-91688cd2ced4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e28ace2-9e11-4223-b58d-91688cd2ced4" (UID: "4e28ace2-9e11-4223-b58d-91688cd2ced4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.361065 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b15f38d-ded0-4fed-add0-c891d2208014-kube-api-access-26pw2" (OuterVolumeSpecName: "kube-api-access-26pw2") pod "1b15f38d-ded0-4fed-add0-c891d2208014" (UID: "1b15f38d-ded0-4fed-add0-c891d2208014"). InnerVolumeSpecName "kube-api-access-26pw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.362880 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e28ace2-9e11-4223-b58d-91688cd2ced4-kube-api-access-mmwr8" (OuterVolumeSpecName: "kube-api-access-mmwr8") pod "4e28ace2-9e11-4223-b58d-91688cd2ced4" (UID: "4e28ace2-9e11-4223-b58d-91688cd2ced4"). InnerVolumeSpecName "kube-api-access-mmwr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.363027 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949acc8b-4603-4567-9864-0328462133a0-kube-api-access-brzct" (OuterVolumeSpecName: "kube-api-access-brzct") pod "949acc8b-4603-4567-9864-0328462133a0" (UID: "949acc8b-4603-4567-9864-0328462133a0"). InnerVolumeSpecName "kube-api-access-brzct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.365053 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-kube-api-access-26ppp" (OuterVolumeSpecName: "kube-api-access-26ppp") pod "15badc78-5ab8-41aa-acfb-4bb1f28bcbab" (UID: "15badc78-5ab8-41aa-acfb-4bb1f28bcbab"). InnerVolumeSpecName "kube-api-access-26ppp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.459756 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brzct\" (UniqueName: \"kubernetes.io/projected/949acc8b-4603-4567-9864-0328462133a0-kube-api-access-brzct\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.460110 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26pw2\" (UniqueName: \"kubernetes.io/projected/1b15f38d-ded0-4fed-add0-c891d2208014-kube-api-access-26pw2\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.460123 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e28ace2-9e11-4223-b58d-91688cd2ced4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.460132 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949acc8b-4603-4567-9864-0328462133a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.460141 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26ppp\" (UniqueName: \"kubernetes.io/projected/15badc78-5ab8-41aa-acfb-4bb1f28bcbab-kube-api-access-26ppp\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.460149 4708 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmwr8\" (UniqueName: \"kubernetes.io/projected/4e28ace2-9e11-4223-b58d-91688cd2ced4-kube-api-access-mmwr8\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.810935 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5805-account-create-update-2qn7m" event={"ID":"4e28ace2-9e11-4223-b58d-91688cd2ced4","Type":"ContainerDied","Data":"fe5708110fc8b898662531d88578cfca7e6d10486496a69ed9002beb2563e43a"} Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.810976 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe5708110fc8b898662531d88578cfca7e6d10486496a69ed9002beb2563e43a" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.811043 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5805-account-create-update-2qn7m" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.815035 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-22qr2-config-ghmvr" event={"ID":"339d5b33-42f5-48e9-9420-a6ebc8a87a70","Type":"ContainerDied","Data":"8fb8a196ae455134ea5c722b57c93388760dccc3a1a6e9348a15067004e7b566"} Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.815091 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fb8a196ae455134ea5c722b57c93388760dccc3a1a6e9348a15067004e7b566" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.815167 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-22qr2-config-ghmvr" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.825777 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nbvxk" event={"ID":"3a9a8c73-828e-42ef-9818-6aab510e8240","Type":"ContainerDied","Data":"6bcd0c950fa1e8753477b0047ffec120b1329dd7d3946acf217a67527081ffb6"} Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.825856 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bcd0c950fa1e8753477b0047ffec120b1329dd7d3946acf217a67527081ffb6" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.825946 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nbvxk" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.831930 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dg24d" event={"ID":"1b15f38d-ded0-4fed-add0-c891d2208014","Type":"ContainerDied","Data":"5b822e364bffae8f871b75e94898897edd22d5cbfba0c637a2066588f21a4c1b"} Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.832200 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b822e364bffae8f871b75e94898897edd22d5cbfba0c637a2066588f21a4c1b" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.832471 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-dg24d" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.838964 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z2bxx" event={"ID":"8d5e7b3a-c1c7-493a-a587-19d751f038be","Type":"ContainerStarted","Data":"16a13aff1fa1eacbf4b6a8eb05485a0bd31a5f831900c3ef98965ab8c09788f5"} Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.841495 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qnn97" event={"ID":"15badc78-5ab8-41aa-acfb-4bb1f28bcbab","Type":"ContainerDied","Data":"a1a296b47ae39b44644af3f619c5ec2415ec1bd7c7d03b0fe89748a20056591b"} Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.841530 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1a296b47ae39b44644af3f619c5ec2415ec1bd7c7d03b0fe89748a20056591b" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.841499 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qnn97" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.851895 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-42e7-account-create-update-jmvzr" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.851907 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-42e7-account-create-update-jmvzr" event={"ID":"949acc8b-4603-4567-9864-0328462133a0","Type":"ContainerDied","Data":"66573adc26884fe9ac19112b37dd18c3a58b65282f31909ea6ec59d3f04c21c0"} Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.851949 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66573adc26884fe9ac19112b37dd18c3a58b65282f31909ea6ec59d3f04c21c0" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.858449 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bed-account-create-update-8dhzm" event={"ID":"247ec5aa-1401-422a-b9f7-71c8c9b4876e","Type":"ContainerDied","Data":"78905953a09515201b7d7544117f6e9eb88f7ab20185c4a73ed0abf10751bb89"} Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.858686 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78905953a09515201b7d7544117f6e9eb88f7ab20185c4a73ed0abf10751bb89" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.858516 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5bed-account-create-update-8dhzm" Mar 20 16:21:08 crc kubenswrapper[4708]: I0320 16:21:08.865355 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-z2bxx" podStartSLOduration=5.583994444 podStartE2EDuration="9.865331742s" podCreationTimestamp="2026-03-20 16:20:59 +0000 UTC" firstStartedPulling="2026-03-20 16:21:03.771088421 +0000 UTC m=+1218.445425126" lastFinishedPulling="2026-03-20 16:21:08.051958495 +0000 UTC m=+1222.726762424" observedRunningTime="2026-03-20 16:21:08.856395658 +0000 UTC m=+1223.530732403" watchObservedRunningTime="2026-03-20 16:21:08.865331742 +0000 UTC m=+1223.539668457" Mar 20 16:21:09 crc kubenswrapper[4708]: I0320 16:21:09.296624 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-22qr2-config-ghmvr"] Mar 20 16:21:09 crc kubenswrapper[4708]: I0320 16:21:09.303471 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-22qr2-config-ghmvr"] Mar 20 16:21:10 crc kubenswrapper[4708]: I0320 16:21:10.139904 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="339d5b33-42f5-48e9-9420-a6ebc8a87a70" path="/var/lib/kubelet/pods/339d5b33-42f5-48e9-9420-a6ebc8a87a70/volumes" Mar 20 16:21:10 crc kubenswrapper[4708]: I0320 16:21:10.875830 4708 generic.go:334] "Generic (PLEG): container finished" podID="ad381906-1b90-4230-a435-9ed844232ba1" containerID="86f51f8e74439206ebaa11043efc755d838b724e0976d43c4481fc6fc7aaf3f4" exitCode=0 Mar 20 16:21:10 crc kubenswrapper[4708]: I0320 16:21:10.875935 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s6247" event={"ID":"ad381906-1b90-4230-a435-9ed844232ba1","Type":"ContainerDied","Data":"86f51f8e74439206ebaa11043efc755d838b724e0976d43c4481fc6fc7aaf3f4"} Mar 20 16:21:11 crc kubenswrapper[4708]: I0320 16:21:11.887857 4708 generic.go:334] "Generic (PLEG): container finished" 
podID="8d5e7b3a-c1c7-493a-a587-19d751f038be" containerID="16a13aff1fa1eacbf4b6a8eb05485a0bd31a5f831900c3ef98965ab8c09788f5" exitCode=0 Mar 20 16:21:11 crc kubenswrapper[4708]: I0320 16:21:11.887948 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z2bxx" event={"ID":"8d5e7b3a-c1c7-493a-a587-19d751f038be","Type":"ContainerDied","Data":"16a13aff1fa1eacbf4b6a8eb05485a0bd31a5f831900c3ef98965ab8c09788f5"} Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.163700 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.227473 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-scripts\") pod \"ad381906-1b90-4230-a435-9ed844232ba1\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.227532 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-dispersionconf\") pod \"ad381906-1b90-4230-a435-9ed844232ba1\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.227579 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-swiftconf\") pod \"ad381906-1b90-4230-a435-9ed844232ba1\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.227663 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-ring-data-devices\") pod \"ad381906-1b90-4230-a435-9ed844232ba1\" (UID: 
\"ad381906-1b90-4230-a435-9ed844232ba1\") " Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.227861 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pkpq\" (UniqueName: \"kubernetes.io/projected/ad381906-1b90-4230-a435-9ed844232ba1-kube-api-access-2pkpq\") pod \"ad381906-1b90-4230-a435-9ed844232ba1\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.227924 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-combined-ca-bundle\") pod \"ad381906-1b90-4230-a435-9ed844232ba1\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.227961 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad381906-1b90-4230-a435-9ed844232ba1-etc-swift\") pod \"ad381906-1b90-4230-a435-9ed844232ba1\" (UID: \"ad381906-1b90-4230-a435-9ed844232ba1\") " Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.229788 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad381906-1b90-4230-a435-9ed844232ba1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ad381906-1b90-4230-a435-9ed844232ba1" (UID: "ad381906-1b90-4230-a435-9ed844232ba1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.230275 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ad381906-1b90-4230-a435-9ed844232ba1" (UID: "ad381906-1b90-4230-a435-9ed844232ba1"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.233513 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad381906-1b90-4230-a435-9ed844232ba1-kube-api-access-2pkpq" (OuterVolumeSpecName: "kube-api-access-2pkpq") pod "ad381906-1b90-4230-a435-9ed844232ba1" (UID: "ad381906-1b90-4230-a435-9ed844232ba1"). InnerVolumeSpecName "kube-api-access-2pkpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.235875 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ad381906-1b90-4230-a435-9ed844232ba1" (UID: "ad381906-1b90-4230-a435-9ed844232ba1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.251348 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-scripts" (OuterVolumeSpecName: "scripts") pod "ad381906-1b90-4230-a435-9ed844232ba1" (UID: "ad381906-1b90-4230-a435-9ed844232ba1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.252309 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad381906-1b90-4230-a435-9ed844232ba1" (UID: "ad381906-1b90-4230-a435-9ed844232ba1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.253806 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ad381906-1b90-4230-a435-9ed844232ba1" (UID: "ad381906-1b90-4230-a435-9ed844232ba1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.329742 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pkpq\" (UniqueName: \"kubernetes.io/projected/ad381906-1b90-4230-a435-9ed844232ba1-kube-api-access-2pkpq\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.329782 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.329796 4708 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ad381906-1b90-4230-a435-9ed844232ba1-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.329804 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.329813 4708 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.329822 4708 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/ad381906-1b90-4230-a435-9ed844232ba1-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.329831 4708 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ad381906-1b90-4230-a435-9ed844232ba1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.899707 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s6247" event={"ID":"ad381906-1b90-4230-a435-9ed844232ba1","Type":"ContainerDied","Data":"ca0586e2c8b4fb388de9b5436530f579afce5846274a2766a30ca656d1785491"} Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.900055 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca0586e2c8b4fb388de9b5436530f579afce5846274a2766a30ca656d1785491" Mar 20 16:21:12 crc kubenswrapper[4708]: I0320 16:21:12.899720 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s6247" Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.207088 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.346231 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-combined-ca-bundle\") pod \"8d5e7b3a-c1c7-493a-a587-19d751f038be\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.346351 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6jvn\" (UniqueName: \"kubernetes.io/projected/8d5e7b3a-c1c7-493a-a587-19d751f038be-kube-api-access-p6jvn\") pod \"8d5e7b3a-c1c7-493a-a587-19d751f038be\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.346511 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-config-data\") pod \"8d5e7b3a-c1c7-493a-a587-19d751f038be\" (UID: \"8d5e7b3a-c1c7-493a-a587-19d751f038be\") " Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.352867 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5e7b3a-c1c7-493a-a587-19d751f038be-kube-api-access-p6jvn" (OuterVolumeSpecName: "kube-api-access-p6jvn") pod "8d5e7b3a-c1c7-493a-a587-19d751f038be" (UID: "8d5e7b3a-c1c7-493a-a587-19d751f038be"). InnerVolumeSpecName "kube-api-access-p6jvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.371814 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d5e7b3a-c1c7-493a-a587-19d751f038be" (UID: "8d5e7b3a-c1c7-493a-a587-19d751f038be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.387784 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-config-data" (OuterVolumeSpecName: "config-data") pod "8d5e7b3a-c1c7-493a-a587-19d751f038be" (UID: "8d5e7b3a-c1c7-493a-a587-19d751f038be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.448400 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.448450 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5e7b3a-c1c7-493a-a587-19d751f038be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.448464 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6jvn\" (UniqueName: \"kubernetes.io/projected/8d5e7b3a-c1c7-493a-a587-19d751f038be-kube-api-access-p6jvn\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.907913 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z2bxx" event={"ID":"8d5e7b3a-c1c7-493a-a587-19d751f038be","Type":"ContainerDied","Data":"206010f36a96475fb25849a00d97c70ac88e4e03d43f77787adf34f3379c1620"} Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.908157 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="206010f36a96475fb25849a00d97c70ac88e4e03d43f77787adf34f3379c1620" Mar 20 16:21:13 crc kubenswrapper[4708]: I0320 16:21:13.907982 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-z2bxx" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161024 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rnh8s"] Mar 20 16:21:14 crc kubenswrapper[4708]: E0320 16:21:14.161388 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5e7b3a-c1c7-493a-a587-19d751f038be" containerName="keystone-db-sync" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161402 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5e7b3a-c1c7-493a-a587-19d751f038be" containerName="keystone-db-sync" Mar 20 16:21:14 crc kubenswrapper[4708]: E0320 16:21:14.161415 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad381906-1b90-4230-a435-9ed844232ba1" containerName="swift-ring-rebalance" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161421 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad381906-1b90-4230-a435-9ed844232ba1" containerName="swift-ring-rebalance" Mar 20 16:21:14 crc kubenswrapper[4708]: E0320 16:21:14.161428 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339d5b33-42f5-48e9-9420-a6ebc8a87a70" containerName="ovn-config" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161435 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="339d5b33-42f5-48e9-9420-a6ebc8a87a70" containerName="ovn-config" Mar 20 16:21:14 crc kubenswrapper[4708]: E0320 16:21:14.161447 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37996e7-c038-4444-bbe4-df1f32b2b029" containerName="dnsmasq-dns" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161453 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37996e7-c038-4444-bbe4-df1f32b2b029" containerName="dnsmasq-dns" Mar 20 16:21:14 crc kubenswrapper[4708]: E0320 16:21:14.161464 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247ec5aa-1401-422a-b9f7-71c8c9b4876e" 
containerName="mariadb-account-create-update" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161470 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="247ec5aa-1401-422a-b9f7-71c8c9b4876e" containerName="mariadb-account-create-update" Mar 20 16:21:14 crc kubenswrapper[4708]: E0320 16:21:14.161479 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e28ace2-9e11-4223-b58d-91688cd2ced4" containerName="mariadb-account-create-update" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161486 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e28ace2-9e11-4223-b58d-91688cd2ced4" containerName="mariadb-account-create-update" Mar 20 16:21:14 crc kubenswrapper[4708]: E0320 16:21:14.161496 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b15f38d-ded0-4fed-add0-c891d2208014" containerName="mariadb-database-create" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161504 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b15f38d-ded0-4fed-add0-c891d2208014" containerName="mariadb-database-create" Mar 20 16:21:14 crc kubenswrapper[4708]: E0320 16:21:14.161516 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949acc8b-4603-4567-9864-0328462133a0" containerName="mariadb-account-create-update" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161524 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="949acc8b-4603-4567-9864-0328462133a0" containerName="mariadb-account-create-update" Mar 20 16:21:14 crc kubenswrapper[4708]: E0320 16:21:14.161536 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15badc78-5ab8-41aa-acfb-4bb1f28bcbab" containerName="mariadb-database-create" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161542 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="15badc78-5ab8-41aa-acfb-4bb1f28bcbab" containerName="mariadb-database-create" Mar 20 16:21:14 crc kubenswrapper[4708]: E0320 16:21:14.161551 4708 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9a8c73-828e-42ef-9818-6aab510e8240" containerName="mariadb-database-create" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161556 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9a8c73-828e-42ef-9818-6aab510e8240" containerName="mariadb-database-create" Mar 20 16:21:14 crc kubenswrapper[4708]: E0320 16:21:14.161563 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37996e7-c038-4444-bbe4-df1f32b2b029" containerName="init" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161569 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37996e7-c038-4444-bbe4-df1f32b2b029" containerName="init" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161748 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d5e7b3a-c1c7-493a-a587-19d751f038be" containerName="keystone-db-sync" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161758 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9a8c73-828e-42ef-9818-6aab510e8240" containerName="mariadb-database-create" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161770 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="339d5b33-42f5-48e9-9420-a6ebc8a87a70" containerName="ovn-config" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161779 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e28ace2-9e11-4223-b58d-91688cd2ced4" containerName="mariadb-account-create-update" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161790 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="15badc78-5ab8-41aa-acfb-4bb1f28bcbab" containerName="mariadb-database-create" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161800 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="247ec5aa-1401-422a-b9f7-71c8c9b4876e" containerName="mariadb-account-create-update" Mar 20 16:21:14 crc 
kubenswrapper[4708]: I0320 16:21:14.161809 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad381906-1b90-4230-a435-9ed844232ba1" containerName="swift-ring-rebalance" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161821 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b15f38d-ded0-4fed-add0-c891d2208014" containerName="mariadb-database-create" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161829 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37996e7-c038-4444-bbe4-df1f32b2b029" containerName="dnsmasq-dns" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.161837 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="949acc8b-4603-4567-9864-0328462133a0" containerName="mariadb-account-create-update" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.162608 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.185263 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rnh8s"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.227790 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cb98m"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.228960 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.234462 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.234770 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.234898 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.235040 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vtmgw" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.235211 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.240290 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cb98m"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.264987 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-config\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.265430 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.265551 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-dns-svc\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.265656 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqvr4\" (UniqueName: \"kubernetes.io/projected/4e348306-307b-4b7f-b42b-27501b1d8d9a-kube-api-access-xqvr4\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.265908 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.371119 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.371226 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-config\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.371272 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-scripts\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.371326 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmrt8\" (UniqueName: \"kubernetes.io/projected/d4e9df92-5542-4a9e-8a92-44dd5286423f-kube-api-access-dmrt8\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.371353 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-credential-keys\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.371372 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.371419 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-dns-svc\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.371444 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqvr4\" (UniqueName: 
\"kubernetes.io/projected/4e348306-307b-4b7f-b42b-27501b1d8d9a-kube-api-access-xqvr4\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.371462 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-config-data\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.371508 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-fernet-keys\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.371525 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-combined-ca-bundle\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.372196 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.372302 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-config\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.372996 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.374021 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-dns-svc\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.390934 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fgjlj"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.397306 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.402454 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.402482 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.402551 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kmxkf" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.406977 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68f5b8d549-7854k"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.407482 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqvr4\" (UniqueName: \"kubernetes.io/projected/4e348306-307b-4b7f-b42b-27501b1d8d9a-kube-api-access-xqvr4\") pod \"dnsmasq-dns-f877ddd87-rnh8s\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") " pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.408451 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.410411 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.410612 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-pvsp2" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.414271 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.414536 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.415414 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fgjlj"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.443107 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68f5b8d549-7854k"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473416 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-scripts\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473469 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-logs\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473510 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-horizon-secret-key\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473538 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c46a759f-98ab-495d-9cab-ba1f2fbbb112-etc-machine-id\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473585 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-combined-ca-bundle\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473607 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-config-data\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473630 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-db-sync-config-data\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473656 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-scripts\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473798 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzb9q\" (UniqueName: \"kubernetes.io/projected/c46a759f-98ab-495d-9cab-ba1f2fbbb112-kube-api-access-nzb9q\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473823 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmrt8\" (UniqueName: \"kubernetes.io/projected/d4e9df92-5542-4a9e-8a92-44dd5286423f-kube-api-access-dmrt8\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473852 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-credential-keys\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473872 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-scripts\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473893 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-config-data\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473912 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-config-data\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473948 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-fernet-keys\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473965 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbz2h\" (UniqueName: \"kubernetes.io/projected/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-kube-api-access-wbz2h\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.473984 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-combined-ca-bundle\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.485128 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-credential-keys\") pod 
\"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.485561 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-config-data\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.492099 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.492460 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-fernet-keys\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.493055 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-combined-ca-bundle\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.498454 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-scripts\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.500145 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmrt8\" (UniqueName: 
\"kubernetes.io/projected/d4e9df92-5542-4a9e-8a92-44dd5286423f-kube-api-access-dmrt8\") pod \"keystone-bootstrap-cb98m\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.551726 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-sr6vd"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.565312 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.577227 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.577748 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wgwkm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.577871 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.577985 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-combined-ca-bundle\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578021 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-config-data\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578046 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-db-sync-config-data\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578087 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzb9q\" (UniqueName: \"kubernetes.io/projected/c46a759f-98ab-495d-9cab-ba1f2fbbb112-kube-api-access-nzb9q\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578114 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-scripts\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578131 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-config-data\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578169 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbz2h\" (UniqueName: \"kubernetes.io/projected/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-kube-api-access-wbz2h\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578187 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-scripts\") pod \"cinder-db-sync-fgjlj\" (UID: 
\"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578204 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-logs\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578231 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-horizon-secret-key\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578259 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c46a759f-98ab-495d-9cab-ba1f2fbbb112-etc-machine-id\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578326 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c46a759f-98ab-495d-9cab-ba1f2fbbb112-etc-machine-id\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.578953 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-scripts\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.579423 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-logs\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.579829 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.584505 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sr6vd"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.587495 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-config-data\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.588275 4708 scope.go:117] "RemoveContainer" containerID="33f4fa4a7b7372a5c55fead11f4ec4eded6db7d99e0aec9664fbcfe988e06cc3" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.592711 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-scripts\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.593539 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-config-data\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.620626 4708 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-horizon-secret-key\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.622353 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbz2h\" (UniqueName: \"kubernetes.io/projected/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-kube-api-access-wbz2h\") pod \"horizon-68f5b8d549-7854k\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.624488 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-combined-ca-bundle\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.624826 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-zk9cx"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.627475 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.638588 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.638794 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-g9d6t" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.639372 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzb9q\" (UniqueName: \"kubernetes.io/projected/c46a759f-98ab-495d-9cab-ba1f2fbbb112-kube-api-access-nzb9q\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.645332 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-db-sync-config-data\") pod \"cinder-db-sync-fgjlj\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.681089 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-config\") pod \"neutron-db-sync-sr6vd\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.681389 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-combined-ca-bundle\") pod \"neutron-db-sync-sr6vd\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.682014 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phsjk\" (UniqueName: \"kubernetes.io/projected/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-kube-api-access-phsjk\") pod \"neutron-db-sync-sr6vd\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.685459 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zk9cx"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.710105 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rnh8s"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.730697 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7q7vc"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.732161 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.738060 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.738418 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.738533 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2gwpt" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.743659 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7q7vc"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.767539 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.771880 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.783209 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brxlv\" (UniqueName: \"kubernetes.io/projected/204070bf-f103-49d9-b366-185454e68b9e-kube-api-access-brxlv\") pod \"barbican-db-sync-zk9cx\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.783283 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-config\") pod \"neutron-db-sync-sr6vd\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.783313 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-db-sync-config-data\") pod \"barbican-db-sync-zk9cx\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.783420 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-combined-ca-bundle\") pod \"neutron-db-sync-sr6vd\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.783455 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-combined-ca-bundle\") pod \"barbican-db-sync-zk9cx\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " 
pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.783508 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phsjk\" (UniqueName: \"kubernetes.io/projected/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-kube-api-access-phsjk\") pod \"neutron-db-sync-sr6vd\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.789070 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86b6976b49-4hvbm"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.790280 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-config\") pod \"neutron-db-sync-sr6vd\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.790477 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.791008 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-combined-ca-bundle\") pod \"neutron-db-sync-sr6vd\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.800380 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.802287 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.816352 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86b6976b49-4hvbm"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.819874 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phsjk\" (UniqueName: \"kubernetes.io/projected/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-kube-api-access-phsjk\") pod \"neutron-db-sync-sr6vd\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.858866 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.886160 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-combined-ca-bundle\") pod \"barbican-db-sync-zk9cx\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.886708 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrm2f\" (UniqueName: \"kubernetes.io/projected/91b42c5b-3a13-4405-b679-546a26c2e78e-kube-api-access-xrm2f\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.886802 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:21:14 crc 
kubenswrapper[4708]: I0320 16:21:14.886844 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91b42c5b-3a13-4405-b679-546a26c2e78e-horizon-secret-key\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.888219 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-config-data\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.888288 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brxlv\" (UniqueName: \"kubernetes.io/projected/204070bf-f103-49d9-b366-185454e68b9e-kube-api-access-brxlv\") pod \"barbican-db-sync-zk9cx\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.888334 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-scripts\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.888383 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:21:14 crc 
kubenswrapper[4708]: I0320 16:21:14.888427 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ded2837-c536-490b-a13c-2a09ea07a7aa-logs\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.888465 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w9pj\" (UniqueName: \"kubernetes.io/projected/11974898-2dd8-4e18-9d89-64442e4dce69-kube-api-access-8w9pj\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.888496 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-combined-ca-bundle\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.888520 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.889104 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91b42c5b-3a13-4405-b679-546a26c2e78e-logs\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: 
I0320 16:21:14.889141 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-db-sync-config-data\") pod \"barbican-db-sync-zk9cx\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.889225 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-config-data\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.889261 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-scripts\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.889646 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-config\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.889757 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz82t\" (UniqueName: \"kubernetes.io/projected/3ded2837-c536-490b-a13c-2a09ea07a7aa-kube-api-access-zz82t\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.910432 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-db-sync-config-data\") pod \"barbican-db-sync-zk9cx\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.921468 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-combined-ca-bundle\") pod \"barbican-db-sync-zk9cx\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.927560 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brxlv\" (UniqueName: \"kubernetes.io/projected/204070bf-f103-49d9-b366-185454e68b9e-kube-api-access-brxlv\") pod \"barbican-db-sync-zk9cx\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.962010 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.964529 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.966878 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.966932 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.982061 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.987560 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.991184 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91b42c5b-3a13-4405-b679-546a26c2e78e-logs\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.991247 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-log-httpd\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.991275 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-config-data\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.991308 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-scripts\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.991339 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.991356 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr97h\" (UniqueName: \"kubernetes.io/projected/5121a54a-778a-4b46-9726-a4ba2901042b-kube-api-access-kr97h\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.991375 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-config\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.991425 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-scripts\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.992232 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-config\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.992325 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz82t\" (UniqueName: \"kubernetes.io/projected/3ded2837-c536-490b-a13c-2a09ea07a7aa-kube-api-access-zz82t\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.992538 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/91b42c5b-3a13-4405-b679-546a26c2e78e-logs\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.992613 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrm2f\" (UniqueName: \"kubernetes.io/projected/91b42c5b-3a13-4405-b679-546a26c2e78e-kube-api-access-xrm2f\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.992813 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.992842 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-config-data\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.992861 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.992923 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91b42c5b-3a13-4405-b679-546a26c2e78e-horizon-secret-key\") pod 
\"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.992958 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-config-data\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.993043 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-scripts\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.993064 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.993097 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ded2837-c536-490b-a13c-2a09ea07a7aa-logs\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.993129 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w9pj\" (UniqueName: \"kubernetes.io/projected/11974898-2dd8-4e18-9d89-64442e4dce69-kube-api-access-8w9pj\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " 
pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.993155 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-run-httpd\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.993174 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.993190 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-combined-ca-bundle\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.994456 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ded2837-c536-490b-a13c-2a09ea07a7aa-logs\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.995056 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-config-data\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.995112 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-scripts\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.995194 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.995637 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.996029 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.996518 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-config-data\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.996969 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-combined-ca-bundle\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc"
Mar 20 16:21:14 crc kubenswrapper[4708]: I0320 16:21:14.997519 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91b42c5b-3a13-4405-b679-546a26c2e78e-horizon-secret-key\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.010216 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrm2f\" (UniqueName: \"kubernetes.io/projected/91b42c5b-3a13-4405-b679-546a26c2e78e-kube-api-access-xrm2f\") pod \"horizon-86b6976b49-4hvbm\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " pod="openstack/horizon-86b6976b49-4hvbm"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.020770 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w9pj\" (UniqueName: \"kubernetes.io/projected/11974898-2dd8-4e18-9d89-64442e4dce69-kube-api-access-8w9pj\") pod \"dnsmasq-dns-68dcc9cf6f-w5nl5\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.021132 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-scripts\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.025124 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz82t\" (UniqueName: \"kubernetes.io/projected/3ded2837-c536-490b-a13c-2a09ea07a7aa-kube-api-access-zz82t\") pod \"placement-db-sync-7q7vc\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " pod="openstack/placement-db-sync-7q7vc"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.056117 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zk9cx"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.090964 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7q7vc"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.094446 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-run-httpd\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.094507 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-log-httpd\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.094540 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.094556 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr97h\" (UniqueName: \"kubernetes.io/projected/5121a54a-778a-4b46-9726-a4ba2901042b-kube-api-access-kr97h\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.094576 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-scripts\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.094622 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-config-data\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.094645 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.095516 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-log-httpd\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.095838 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-run-httpd\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.100062 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.104863 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.106702 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-config-data\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.109209 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-scripts\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.117317 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr97h\" (UniqueName: \"kubernetes.io/projected/5121a54a-778a-4b46-9726-a4ba2901042b-kube-api-access-kr97h\") pod \"ceilometer-0\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.149783 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86b6976b49-4hvbm"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.161848 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.241714 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rnh8s"]
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.299377 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.434965 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cb98m"]
Mar 20 16:21:15 crc kubenswrapper[4708]: W0320 16:21:15.447221 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4e9df92_5542_4a9e_8a92_44dd5286423f.slice/crio-ca4a77910580b62a7b3739dc3a9ba2f47e87138c05acc00c08101c0c72ce90b6 WatchSource:0}: Error finding container ca4a77910580b62a7b3739dc3a9ba2f47e87138c05acc00c08101c0c72ce90b6: Status 404 returned error can't find the container with id ca4a77910580b62a7b3739dc3a9ba2f47e87138c05acc00c08101c0c72ce90b6
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.545338 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68f5b8d549-7854k"]
Mar 20 16:21:15 crc kubenswrapper[4708]: W0320 16:21:15.556116 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f5b840a_bd1e_4e3e_b970_c531e84a22ff.slice/crio-4c32c46dbc8e9d2135302903295706ee8698675219b303b0a475b262fea671be WatchSource:0}: Error finding container 4c32c46dbc8e9d2135302903295706ee8698675219b303b0a475b262fea671be: Status 404 returned error can't find the container with id 4c32c46dbc8e9d2135302903295706ee8698675219b303b0a475b262fea671be
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.687496 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sr6vd"]
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.718049 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fgjlj"]
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.806765 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7q7vc"]
Mar 20 16:21:15 crc kubenswrapper[4708]: W0320 16:21:15.814170 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ded2837_c536_490b_a13c_2a09ea07a7aa.slice/crio-15fd3b4ce0fe54562d451590754f685f086e3116e19546500c2a9179022ab7d6 WatchSource:0}: Error finding container 15fd3b4ce0fe54562d451590754f685f086e3116e19546500c2a9179022ab7d6: Status 404 returned error can't find the container with id 15fd3b4ce0fe54562d451590754f685f086e3116e19546500c2a9179022ab7d6
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.815629 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zk9cx"]
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.905177 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68f5b8d549-7854k"]
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.958480 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56c48d8cfc-q27gm"]
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.961919 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.979068 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56c48d8cfc-q27gm"]
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.991192 4708 generic.go:334] "Generic (PLEG): container finished" podID="4e348306-307b-4b7f-b42b-27501b1d8d9a" containerID="f6e832c6479f891a206c7cc05080b27fac0f60493c23b8fd1edc14157cb73812" exitCode=0
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.991244 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" event={"ID":"4e348306-307b-4b7f-b42b-27501b1d8d9a","Type":"ContainerDied","Data":"f6e832c6479f891a206c7cc05080b27fac0f60493c23b8fd1edc14157cb73812"}
Mar 20 16:21:15 crc kubenswrapper[4708]: I0320 16:21:15.991269 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" event={"ID":"4e348306-307b-4b7f-b42b-27501b1d8d9a","Type":"ContainerStarted","Data":"0a8e48ec1e2f8c925410ef1ffacff44feca60c574277ab59fceb24004d4b619b"}
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:15.998112 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgjlj" event={"ID":"c46a759f-98ab-495d-9cab-ba1f2fbbb112","Type":"ContainerStarted","Data":"0aba4e1c947c8a0b7300103b6779e39a983f76d5367b8563b8d8e64b750837f3"}
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:15.999056 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86b6976b49-4hvbm"]
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.011037 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68f5b8d549-7854k" event={"ID":"0f5b840a-bd1e-4e3e-b970-c531e84a22ff","Type":"ContainerStarted","Data":"4c32c46dbc8e9d2135302903295706ee8698675219b303b0a475b262fea671be"}
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.023995 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.025230 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnbf6\" (UniqueName: \"kubernetes.io/projected/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-kube-api-access-qnbf6\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.025277 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-config-data\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.025305 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-logs\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.025355 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-scripts\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.025456 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-horizon-secret-key\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.034521 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sr6vd" event={"ID":"52e4d34b-0c95-475c-b9e5-be1dff27d5a3","Type":"ContainerStarted","Data":"3b77373fd4ca356a1459aa2e3706bfa9ae9fe7760173b67b5d9aebd29dc326dd"}
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.039051 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cb98m" event={"ID":"d4e9df92-5542-4a9e-8a92-44dd5286423f","Type":"ContainerStarted","Data":"05b518bfc1c6f70dc0452b73d35a9222f4c8b3d89de1a070a700bb1630575c98"}
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.039111 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cb98m" event={"ID":"d4e9df92-5542-4a9e-8a92-44dd5286423f","Type":"ContainerStarted","Data":"ca4a77910580b62a7b3739dc3a9ba2f47e87138c05acc00c08101c0c72ce90b6"}
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.039319 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"]
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.059182 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7q7vc" event={"ID":"3ded2837-c536-490b-a13c-2a09ea07a7aa","Type":"ContainerStarted","Data":"15fd3b4ce0fe54562d451590754f685f086e3116e19546500c2a9179022ab7d6"}
Mar 20 16:21:16 crc kubenswrapper[4708]: W0320 16:21:16.062362 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91b42c5b_3a13_4405_b679_546a26c2e78e.slice/crio-31f41f615ec9714c586b7a6d19d35e45b863b9b593b2e84ad25aed8dd7f0f0a9 WatchSource:0}: Error finding container 31f41f615ec9714c586b7a6d19d35e45b863b9b593b2e84ad25aed8dd7f0f0a9: Status 404 returned error can't find the container with id 31f41f615ec9714c586b7a6d19d35e45b863b9b593b2e84ad25aed8dd7f0f0a9
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.083620 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zk9cx" event={"ID":"204070bf-f103-49d9-b366-185454e68b9e","Type":"ContainerStarted","Data":"2bc5dab68aa1bb97b34e64d6a0cbbdae5d66c01c39fdc16d14813532c15e43f3"}
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.129666 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-horizon-secret-key\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.129722 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnbf6\" (UniqueName: \"kubernetes.io/projected/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-kube-api-access-qnbf6\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.129766 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-config-data\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.129790 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-logs\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.129854 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-scripts\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.171222 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-horizon-secret-key\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.172719 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-scripts\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.172742 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-config-data\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.172950 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-logs\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.291157 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnbf6\" (UniqueName: \"kubernetes.io/projected/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-kube-api-access-qnbf6\") pod \"horizon-56c48d8cfc-q27gm\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") " pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.291285 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cb98m" podStartSLOduration=2.291261956 podStartE2EDuration="2.291261956s" podCreationTimestamp="2026-03-20 16:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:16.284592424 +0000 UTC m=+1230.958929139" watchObservedRunningTime="2026-03-20 16:21:16.291261956 +0000 UTC m=+1230.965598681"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.318995 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.446585 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.625322 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-rnh8s"
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.708923 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-dns-svc\") pod \"4e348306-307b-4b7f-b42b-27501b1d8d9a\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") "
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.708996 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqvr4\" (UniqueName: \"kubernetes.io/projected/4e348306-307b-4b7f-b42b-27501b1d8d9a-kube-api-access-xqvr4\") pod \"4e348306-307b-4b7f-b42b-27501b1d8d9a\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") "
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.709030 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-sb\") pod \"4e348306-307b-4b7f-b42b-27501b1d8d9a\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") "
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.709100 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-config\") pod \"4e348306-307b-4b7f-b42b-27501b1d8d9a\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") "
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.709169 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-nb\") pod \"4e348306-307b-4b7f-b42b-27501b1d8d9a\" (UID: \"4e348306-307b-4b7f-b42b-27501b1d8d9a\") "
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.722631 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e348306-307b-4b7f-b42b-27501b1d8d9a-kube-api-access-xqvr4" (OuterVolumeSpecName: "kube-api-access-xqvr4") pod "4e348306-307b-4b7f-b42b-27501b1d8d9a" (UID: "4e348306-307b-4b7f-b42b-27501b1d8d9a"). InnerVolumeSpecName "kube-api-access-xqvr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.750327 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e348306-307b-4b7f-b42b-27501b1d8d9a" (UID: "4e348306-307b-4b7f-b42b-27501b1d8d9a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.765227 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-config" (OuterVolumeSpecName: "config") pod "4e348306-307b-4b7f-b42b-27501b1d8d9a" (UID: "4e348306-307b-4b7f-b42b-27501b1d8d9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.766934 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e348306-307b-4b7f-b42b-27501b1d8d9a" (UID: "4e348306-307b-4b7f-b42b-27501b1d8d9a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.770192 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e348306-307b-4b7f-b42b-27501b1d8d9a" (UID: "4e348306-307b-4b7f-b42b-27501b1d8d9a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.811552 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.811592 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.811605 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqvr4\" (UniqueName: \"kubernetes.io/projected/4e348306-307b-4b7f-b42b-27501b1d8d9a-kube-api-access-xqvr4\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.811619 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:16 crc kubenswrapper[4708]: I0320 16:21:16.811630 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e348306-307b-4b7f-b42b-27501b1d8d9a-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.041493 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56c48d8cfc-q27gm"]
Mar 20 16:21:17 crc kubenswrapper[4708]: W0320 16:21:17.053941 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73f2d616_aca1_4db4_890a_e6b02eaa9f4a.slice/crio-d98f7c935b937e6c51a555193c4238aa31edfec97324d1e45fa5ebb635215fba WatchSource:0}: Error finding container d98f7c935b937e6c51a555193c4238aa31edfec97324d1e45fa5ebb635215fba: Status 404 returned error can't find the container with id d98f7c935b937e6c51a555193c4238aa31edfec97324d1e45fa5ebb635215fba
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.099930 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-rnh8s" event={"ID":"4e348306-307b-4b7f-b42b-27501b1d8d9a","Type":"ContainerDied","Data":"0a8e48ec1e2f8c925410ef1ffacff44feca60c574277ab59fceb24004d4b619b"}
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.099986 4708 scope.go:117] "RemoveContainer" containerID="f6e832c6479f891a206c7cc05080b27fac0f60493c23b8fd1edc14157cb73812"
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.100000 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-rnh8s"
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.102256 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86b6976b49-4hvbm" event={"ID":"91b42c5b-3a13-4405-b679-546a26c2e78e","Type":"ContainerStarted","Data":"31f41f615ec9714c586b7a6d19d35e45b863b9b593b2e84ad25aed8dd7f0f0a9"}
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.109864 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56c48d8cfc-q27gm" event={"ID":"73f2d616-aca1-4db4-890a-e6b02eaa9f4a","Type":"ContainerStarted","Data":"d98f7c935b937e6c51a555193c4238aa31edfec97324d1e45fa5ebb635215fba"}
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.115793 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sr6vd" event={"ID":"52e4d34b-0c95-475c-b9e5-be1dff27d5a3","Type":"ContainerStarted","Data":"612769ba32a5de6e87931c0a149f996bebf73631f256e8394b11aa927a7f24f1"}
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.121115 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5121a54a-778a-4b46-9726-a4ba2901042b","Type":"ContainerStarted","Data":"5eac20424b38411cf6b3824526f41b6e8d68e9b2ceb26101fb1c396149ee0572"}
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.123843 4708 generic.go:334] "Generic (PLEG): container finished" podID="11974898-2dd8-4e18-9d89-64442e4dce69" containerID="afd99ff6ac9de8ab71d274a0212a70d25d2ad98998cf887508a60041c2ce366b" exitCode=0
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.125270 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" event={"ID":"11974898-2dd8-4e18-9d89-64442e4dce69","Type":"ContainerDied","Data":"afd99ff6ac9de8ab71d274a0212a70d25d2ad98998cf887508a60041c2ce366b"}
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.125348 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" event={"ID":"11974898-2dd8-4e18-9d89-64442e4dce69","Type":"ContainerStarted","Data":"d5e61fae0342423cb2f46edbd46b91181877e9ffe8240c18e98ef90a03ad32d2"}
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.192772 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-sr6vd" podStartSLOduration=3.192748706 podStartE2EDuration="3.192748706s" podCreationTimestamp="2026-03-20 16:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:17.143066955 +0000 UTC m=+1231.817403690" watchObservedRunningTime="2026-03-20 16:21:17.192748706 +0000 UTC m=+1231.867085441"
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.305736 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rnh8s"]
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.322484 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rnh8s"]
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.629992 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0"
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.636355 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b88259-a142-42a0-ab2c-bb0980ad9465-etc-swift\") pod \"swift-storage-0\" (UID: \"b5b88259-a142-42a0-ab2c-bb0980ad9465\") " pod="openstack/swift-storage-0"
Mar 20 16:21:17 crc kubenswrapper[4708]: I0320 16:21:17.721962 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 20 16:21:18 crc kubenswrapper[4708]: I0320 16:21:18.149135 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e348306-307b-4b7f-b42b-27501b1d8d9a" path="/var/lib/kubelet/pods/4e348306-307b-4b7f-b42b-27501b1d8d9a/volumes"
Mar 20 16:21:18 crc kubenswrapper[4708]: I0320 16:21:18.152818 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rpnp8" event={"ID":"e73e6a53-ccd4-45bf-ad96-a6de1e696888","Type":"ContainerStarted","Data":"26b6e59e594f71b347641897b67adbb4240aece918c6018f1e81ff0c16b097cd"}
Mar 20 16:21:18 crc kubenswrapper[4708]: I0320 16:21:18.157361 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" event={"ID":"11974898-2dd8-4e18-9d89-64442e4dce69","Type":"ContainerStarted","Data":"f84bd0d3b9c8131f026032beb565feac8cb56887349ac08bac72d55be8487821"}
Mar 20 16:21:18 crc kubenswrapper[4708]: I0320 16:21:18.158009 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"
Mar 20 16:21:18 crc kubenswrapper[4708]: I0320 16:21:18.195042 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-rpnp8" podStartSLOduration=3.592303844 podStartE2EDuration="36.195022704s" podCreationTimestamp="2026-03-20 16:20:42 +0000 UTC" firstStartedPulling="2026-03-20 16:20:43.34824241 +0000 UTC m=+1198.022579135" lastFinishedPulling="2026-03-20 16:21:15.95096128 +0000 UTC m=+1230.625297995" observedRunningTime="2026-03-20 16:21:18.172187658 +0000 UTC m=+1232.846524393" watchObservedRunningTime="2026-03-20 16:21:18.195022704 +0000 UTC m=+1232.869359419"
Mar 20 16:21:18 crc kubenswrapper[4708]: I0320 16:21:18.207425 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" podStartSLOduration=4.207408023 podStartE2EDuration="4.207408023s" podCreationTimestamp="2026-03-20 16:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:18.200288068 +0000 UTC m=+1232.874624793" watchObservedRunningTime="2026-03-20 16:21:18.207408023 +0000 UTC m=+1232.881744738"
Mar 20 16:21:18 crc kubenswrapper[4708]: I0320 16:21:18.378190 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 20 16:21:18 crc kubenswrapper[4708]: W0320 16:21:18.388410 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5b88259_a142_42a0_ab2c_bb0980ad9465.slice/crio-dd8b29e5169436b3f2cf2d553e7c335f7d4a06a2ad8e72010bac3e96eea0189e WatchSource:0}: Error finding container dd8b29e5169436b3f2cf2d553e7c335f7d4a06a2ad8e72010bac3e96eea0189e: Status 404 returned error can't find the container with id dd8b29e5169436b3f2cf2d553e7c335f7d4a06a2ad8e72010bac3e96eea0189e
Mar 20 16:21:19 crc kubenswrapper[4708]: I0320 16:21:19.191694 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"dd8b29e5169436b3f2cf2d553e7c335f7d4a06a2ad8e72010bac3e96eea0189e"}
Mar 20 16:21:21 crc kubenswrapper[4708]: I0320 16:21:21.229846 4708 generic.go:334] "Generic (PLEG): container finished" podID="d4e9df92-5542-4a9e-8a92-44dd5286423f" containerID="05b518bfc1c6f70dc0452b73d35a9222f4c8b3d89de1a070a700bb1630575c98" exitCode=0
Mar 20 16:21:21 crc kubenswrapper[4708]: I0320 16:21:21.229916 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cb98m" event={"ID":"d4e9df92-5542-4a9e-8a92-44dd5286423f","Type":"ContainerDied","Data":"05b518bfc1c6f70dc0452b73d35a9222f4c8b3d89de1a070a700bb1630575c98"}
Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.384279 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86b6976b49-4hvbm"]
Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.416174 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b6ff5cbbd-kjfxp"]
Mar 20 16:21:22 crc kubenswrapper[4708]: E0320 16:21:22.417960 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e348306-307b-4b7f-b42b-27501b1d8d9a" containerName="init"
Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.418002 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e348306-307b-4b7f-b42b-27501b1d8d9a" containerName="init"
Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.418819 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e348306-307b-4b7f-b42b-27501b1d8d9a" containerName="init"
Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.421343 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.424063 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.434204 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b6ff5cbbd-kjfxp"] Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.505219 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56c48d8cfc-q27gm"] Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.520131 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bc9dd67b8-mz4lv"] Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.522038 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.533131 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15901de5-ddbe-4c7b-8968-8c614619be4d-config-data\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.533181 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15901de5-ddbe-4c7b-8968-8c614619be4d-scripts\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.533208 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15901de5-ddbe-4c7b-8968-8c614619be4d-combined-ca-bundle\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: 
\"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.533235 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-scripts\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.533254 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-tls-certs\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.546922 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/15901de5-ddbe-4c7b-8968-8c614619be4d-horizon-secret-key\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.547027 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-logs\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.547076 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15901de5-ddbe-4c7b-8968-8c614619be4d-logs\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " 
pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.547225 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlpj7\" (UniqueName: \"kubernetes.io/projected/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-kube-api-access-xlpj7\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.547295 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-combined-ca-bundle\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.547360 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6zmg\" (UniqueName: \"kubernetes.io/projected/15901de5-ddbe-4c7b-8968-8c614619be4d-kube-api-access-h6zmg\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.547406 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/15901de5-ddbe-4c7b-8968-8c614619be4d-horizon-tls-certs\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.547464 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-config-data\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: 
\"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.547486 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-secret-key\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.554382 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bc9dd67b8-mz4lv"] Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649353 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/15901de5-ddbe-4c7b-8968-8c614619be4d-horizon-tls-certs\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649477 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-config-data\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649521 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-secret-key\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649540 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/15901de5-ddbe-4c7b-8968-8c614619be4d-config-data\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649582 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15901de5-ddbe-4c7b-8968-8c614619be4d-scripts\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649607 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15901de5-ddbe-4c7b-8968-8c614619be4d-combined-ca-bundle\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649622 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-scripts\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649637 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-tls-certs\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649714 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/15901de5-ddbe-4c7b-8968-8c614619be4d-horizon-secret-key\") pod 
\"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649749 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-logs\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649768 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15901de5-ddbe-4c7b-8968-8c614619be4d-logs\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649810 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlpj7\" (UniqueName: \"kubernetes.io/projected/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-kube-api-access-xlpj7\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649835 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-combined-ca-bundle\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.649862 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6zmg\" (UniqueName: \"kubernetes.io/projected/15901de5-ddbe-4c7b-8968-8c614619be4d-kube-api-access-h6zmg\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " 
pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.650479 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15901de5-ddbe-4c7b-8968-8c614619be4d-scripts\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.654296 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-config-data\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.654606 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-logs\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.655214 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-scripts\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.659640 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15901de5-ddbe-4c7b-8968-8c614619be4d-logs\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.660646 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/15901de5-ddbe-4c7b-8968-8c614619be4d-config-data\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.667137 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-secret-key\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.667236 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-combined-ca-bundle\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.668557 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15901de5-ddbe-4c7b-8968-8c614619be4d-combined-ca-bundle\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.671661 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/15901de5-ddbe-4c7b-8968-8c614619be4d-horizon-secret-key\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.672237 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/15901de5-ddbe-4c7b-8968-8c614619be4d-horizon-tls-certs\") pod \"horizon-7bc9dd67b8-mz4lv\" 
(UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.672981 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6zmg\" (UniqueName: \"kubernetes.io/projected/15901de5-ddbe-4c7b-8968-8c614619be4d-kube-api-access-h6zmg\") pod \"horizon-7bc9dd67b8-mz4lv\" (UID: \"15901de5-ddbe-4c7b-8968-8c614619be4d\") " pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.673330 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-tls-certs\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.676408 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlpj7\" (UniqueName: \"kubernetes.io/projected/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-kube-api-access-xlpj7\") pod \"horizon-6b6ff5cbbd-kjfxp\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") " pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.765571 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:22 crc kubenswrapper[4708]: I0320 16:21:22.865741 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.840756 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.875831 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmrt8\" (UniqueName: \"kubernetes.io/projected/d4e9df92-5542-4a9e-8a92-44dd5286423f-kube-api-access-dmrt8\") pod \"d4e9df92-5542-4a9e-8a92-44dd5286423f\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.875944 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-fernet-keys\") pod \"d4e9df92-5542-4a9e-8a92-44dd5286423f\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.876003 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-credential-keys\") pod \"d4e9df92-5542-4a9e-8a92-44dd5286423f\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.876088 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-combined-ca-bundle\") pod \"d4e9df92-5542-4a9e-8a92-44dd5286423f\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.876183 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-scripts\") pod \"d4e9df92-5542-4a9e-8a92-44dd5286423f\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.876338 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-config-data\") pod \"d4e9df92-5542-4a9e-8a92-44dd5286423f\" (UID: \"d4e9df92-5542-4a9e-8a92-44dd5286423f\") " Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.884177 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d4e9df92-5542-4a9e-8a92-44dd5286423f" (UID: "d4e9df92-5542-4a9e-8a92-44dd5286423f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.886046 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e9df92-5542-4a9e-8a92-44dd5286423f-kube-api-access-dmrt8" (OuterVolumeSpecName: "kube-api-access-dmrt8") pod "d4e9df92-5542-4a9e-8a92-44dd5286423f" (UID: "d4e9df92-5542-4a9e-8a92-44dd5286423f"). InnerVolumeSpecName "kube-api-access-dmrt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.886533 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-scripts" (OuterVolumeSpecName: "scripts") pod "d4e9df92-5542-4a9e-8a92-44dd5286423f" (UID: "d4e9df92-5542-4a9e-8a92-44dd5286423f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.898293 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d4e9df92-5542-4a9e-8a92-44dd5286423f" (UID: "d4e9df92-5542-4a9e-8a92-44dd5286423f"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.909055 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4e9df92-5542-4a9e-8a92-44dd5286423f" (UID: "d4e9df92-5542-4a9e-8a92-44dd5286423f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.919308 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-config-data" (OuterVolumeSpecName: "config-data") pod "d4e9df92-5542-4a9e-8a92-44dd5286423f" (UID: "d4e9df92-5542-4a9e-8a92-44dd5286423f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.978794 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.978837 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmrt8\" (UniqueName: \"kubernetes.io/projected/d4e9df92-5542-4a9e-8a92-44dd5286423f-kube-api-access-dmrt8\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.978849 4708 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.978857 4708 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 
20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.978868 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:23 crc kubenswrapper[4708]: I0320 16:21:23.978876 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e9df92-5542-4a9e-8a92-44dd5286423f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:24 crc kubenswrapper[4708]: I0320 16:21:24.281322 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cb98m" event={"ID":"d4e9df92-5542-4a9e-8a92-44dd5286423f","Type":"ContainerDied","Data":"ca4a77910580b62a7b3739dc3a9ba2f47e87138c05acc00c08101c0c72ce90b6"} Mar 20 16:21:24 crc kubenswrapper[4708]: I0320 16:21:24.281360 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca4a77910580b62a7b3739dc3a9ba2f47e87138c05acc00c08101c0c72ce90b6" Mar 20 16:21:24 crc kubenswrapper[4708]: I0320 16:21:24.281375 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cb98m" Mar 20 16:21:24 crc kubenswrapper[4708]: I0320 16:21:24.939091 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-cb98m"] Mar 20 16:21:24 crc kubenswrapper[4708]: I0320 16:21:24.947343 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-cb98m"] Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.040966 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-q5htg"] Mar 20 16:21:25 crc kubenswrapper[4708]: E0320 16:21:25.041455 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e9df92-5542-4a9e-8a92-44dd5286423f" containerName="keystone-bootstrap" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.041471 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e9df92-5542-4a9e-8a92-44dd5286423f" containerName="keystone-bootstrap" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.041626 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e9df92-5542-4a9e-8a92-44dd5286423f" containerName="keystone-bootstrap" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.042364 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.046295 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.046651 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vtmgw" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.046651 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.046854 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.047052 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.056275 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q5htg"] Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.099809 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-scripts\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.099888 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-fernet-keys\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.099945 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-credential-keys\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.100008 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q4k7\" (UniqueName: \"kubernetes.io/projected/341b59c3-684f-45e4-9d42-ed258e0e671b-kube-api-access-6q4k7\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.100047 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-config-data\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.100109 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-combined-ca-bundle\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.164309 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.201190 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-fernet-keys\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc 
kubenswrapper[4708]: I0320 16:21:25.201289 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-credential-keys\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.202272 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q4k7\" (UniqueName: \"kubernetes.io/projected/341b59c3-684f-45e4-9d42-ed258e0e671b-kube-api-access-6q4k7\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.202311 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-config-data\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.202374 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-combined-ca-bundle\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.202449 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-scripts\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.209183 4708 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-credential-keys\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.213005 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-combined-ca-bundle\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.216710 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-config-data\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.219621 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-scripts\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.226756 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-fernet-keys\") pod \"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.244682 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q4k7\" (UniqueName: \"kubernetes.io/projected/341b59c3-684f-45e4-9d42-ed258e0e671b-kube-api-access-6q4k7\") pod 
\"keystone-bootstrap-q5htg\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.247186 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhwvr"] Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.247835 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-qhwvr" podUID="3dc07138-7824-4a54-957c-52e9dd4120ac" containerName="dnsmasq-dns" containerID="cri-o://864ab416fb5ece0fa3759dfb52b714e9a2d4af6178734457b61c131fb529c35a" gracePeriod=10 Mar 20 16:21:25 crc kubenswrapper[4708]: I0320 16:21:25.371786 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:26 crc kubenswrapper[4708]: I0320 16:21:26.131351 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e9df92-5542-4a9e-8a92-44dd5286423f" path="/var/lib/kubelet/pods/d4e9df92-5542-4a9e-8a92-44dd5286423f/volumes" Mar 20 16:21:26 crc kubenswrapper[4708]: I0320 16:21:26.326543 4708 generic.go:334] "Generic (PLEG): container finished" podID="3dc07138-7824-4a54-957c-52e9dd4120ac" containerID="864ab416fb5ece0fa3759dfb52b714e9a2d4af6178734457b61c131fb529c35a" exitCode=0 Mar 20 16:21:26 crc kubenswrapper[4708]: I0320 16:21:26.326601 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhwvr" event={"ID":"3dc07138-7824-4a54-957c-52e9dd4120ac","Type":"ContainerDied","Data":"864ab416fb5ece0fa3759dfb52b714e9a2d4af6178734457b61c131fb529c35a"} Mar 20 16:21:29 crc kubenswrapper[4708]: I0320 16:21:29.762926 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-qhwvr" podUID="3dc07138-7824-4a54-957c-52e9dd4120ac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 20 16:21:34 crc 
kubenswrapper[4708]: E0320 16:21:34.353385 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 16:21:34 crc kubenswrapper[4708]: E0320 16:21:34.354259 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n679hd6h66bh65h555hch556h558h598h645h555h565h97h66dh97hd8h5fbh5chcfh554h79h556h54h665h56fh8h64h557h57fh5bch546h649q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wbz2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:ni
l,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-68f5b8d549-7854k_openstack(0f5b840a-bd1e-4e3e-b970-c531e84a22ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:21:34 crc kubenswrapper[4708]: E0320 16:21:34.363959 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-68f5b8d549-7854k" podUID="0f5b840a-bd1e-4e3e-b970-c531e84a22ff" Mar 20 16:21:34 crc kubenswrapper[4708]: E0320 16:21:34.671050 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-swift-account:current-podified" Mar 20 16:21:34 crc kubenswrapper[4708]: E0320 16:21:34.671204 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:account-server,Image:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,Command:[/usr/bin/swift-account-server /etc/swift/account-server.conf.d 
-v],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:account,HostPort:0,ContainerPort:6202,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n669h647h697h557h694h5ffh5cdh5b9h59fh676h668h677h5f9h587hd5h5fdh59hd4h665h578h56bh65dh64ch8bh647h5dfhfch576h5fdh5b9h85h697q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:swift,ReadOnly:false,MountPath:/srv/node/pv,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-swift,ReadOnly:false,MountPath:/etc/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cache,ReadOnly:false,MountPath:/var/cache/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lock,ReadOnly:false,MountPath:/var/lock,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mshfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42445,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
swift-storage-0_openstack(b5b88259-a142-42a0-ab2c-bb0980ad9465): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:21:34 crc kubenswrapper[4708]: E0320 16:21:34.690301 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 16:21:34 crc kubenswrapper[4708]: E0320 16:21:34.690489 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n689h595h577h689h64fh66h54fh58fh586h59h5ddh5d4h686h679h68chf5h5f4h575hffh566h5d5h55ch7bh5cbh67bhc6h54dh5bdh587h5d6h558h59fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xrm2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capab
ilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-86b6976b49-4hvbm_openstack(91b42c5b-3a13-4405-b679-546a26c2e78e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:21:34 crc kubenswrapper[4708]: E0320 16:21:34.693033 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-86b6976b49-4hvbm" podUID="91b42c5b-3a13-4405-b679-546a26c2e78e" Mar 20 16:21:34 crc kubenswrapper[4708]: I0320 16:21:34.762118 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-qhwvr" podUID="3dc07138-7824-4a54-957c-52e9dd4120ac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 20 16:21:35 crc kubenswrapper[4708]: E0320 16:21:35.029090 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 20 16:21:35 crc kubenswrapper[4708]: E0320 16:21:35.029255 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n695h669h669h579h5bbh56dh566h679h79h588h66hc6h64ch68dhbh669h5b6hb9h675h56fhbdhfh685h588h5c6h5dfh5dbh5h56bh687h549h665q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kr97h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(5121a54a-778a-4b46-9726-a4ba2901042b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:21:41 crc kubenswrapper[4708]: I0320 16:21:41.444924 4708 generic.go:334] "Generic (PLEG): container finished" podID="e73e6a53-ccd4-45bf-ad96-a6de1e696888" containerID="26b6e59e594f71b347641897b67adbb4240aece918c6018f1e81ff0c16b097cd" exitCode=0 Mar 20 16:21:41 crc kubenswrapper[4708]: I0320 16:21:41.445417 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rpnp8" event={"ID":"e73e6a53-ccd4-45bf-ad96-a6de1e696888","Type":"ContainerDied","Data":"26b6e59e594f71b347641897b67adbb4240aece918c6018f1e81ff0c16b097cd"} Mar 20 16:21:42 crc kubenswrapper[4708]: E0320 16:21:42.122760 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 16:21:42 crc kubenswrapper[4708]: E0320 16:21:42.122944 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-brxlv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-zk9cx_openstack(204070bf-f103-49d9-b366-185454e68b9e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:21:42 crc kubenswrapper[4708]: E0320 16:21:42.124195 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-zk9cx" podUID="204070bf-f103-49d9-b366-185454e68b9e" Mar 20 16:21:42 crc kubenswrapper[4708]: E0320 16:21:42.141155 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 16:21:42 crc kubenswrapper[4708]: E0320 16:21:42.141337 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57ch664h5c7h59dh7dh76h674hbch5fch87h5b8hddhf7h658h9bh596h96h569h66h7fh599hf9h8dh7fh7dhc9h56fh66fhb5h5bbh9dhc9q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qnbf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&Securit
yContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-56c48d8cfc-q27gm_openstack(73f2d616-aca1-4db4-890a-e6b02eaa9f4a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 16:21:42 crc kubenswrapper[4708]: E0320 16:21:42.143715 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-56c48d8cfc-q27gm" podUID="73f2d616-aca1-4db4-890a-e6b02eaa9f4a" Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.243535 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68f5b8d549-7854k" Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.256912 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhwvr" Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.259534 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86b6976b49-4hvbm" Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.436750 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrm2f\" (UniqueName: \"kubernetes.io/projected/91b42c5b-3a13-4405-b679-546a26c2e78e-kube-api-access-xrm2f\") pod \"91b42c5b-3a13-4405-b679-546a26c2e78e\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.436843 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-horizon-secret-key\") pod \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.436880 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-config-data\") pod \"91b42c5b-3a13-4405-b679-546a26c2e78e\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.436905 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91b42c5b-3a13-4405-b679-546a26c2e78e-horizon-secret-key\") pod \"91b42c5b-3a13-4405-b679-546a26c2e78e\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.437001 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-config\") pod \"3dc07138-7824-4a54-957c-52e9dd4120ac\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.437045 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-dns-svc\") pod \"3dc07138-7824-4a54-957c-52e9dd4120ac\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.437073 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-logs\") pod \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.437096 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-nb\") pod \"3dc07138-7824-4a54-957c-52e9dd4120ac\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.437179 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz89f\" (UniqueName: \"kubernetes.io/projected/3dc07138-7824-4a54-957c-52e9dd4120ac-kube-api-access-hz89f\") pod \"3dc07138-7824-4a54-957c-52e9dd4120ac\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.437244 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-config-data\") pod \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.437344 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-scripts\") pod \"91b42c5b-3a13-4405-b679-546a26c2e78e\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.437371 4708 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-sb\") pod \"3dc07138-7824-4a54-957c-52e9dd4120ac\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.437436 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbz2h\" (UniqueName: \"kubernetes.io/projected/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-kube-api-access-wbz2h\") pod \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.437463 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91b42c5b-3a13-4405-b679-546a26c2e78e-logs\") pod \"91b42c5b-3a13-4405-b679-546a26c2e78e\" (UID: \"91b42c5b-3a13-4405-b679-546a26c2e78e\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.437491 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-scripts\") pod \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\" (UID: \"0f5b840a-bd1e-4e3e-b970-c531e84a22ff\") " Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.438319 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-scripts" (OuterVolumeSpecName: "scripts") pod "91b42c5b-3a13-4405-b679-546a26c2e78e" (UID: "91b42c5b-3a13-4405-b679-546a26c2e78e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.438729 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91b42c5b-3a13-4405-b679-546a26c2e78e-logs" (OuterVolumeSpecName: "logs") pod "91b42c5b-3a13-4405-b679-546a26c2e78e" (UID: "91b42c5b-3a13-4405-b679-546a26c2e78e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.439206 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-logs" (OuterVolumeSpecName: "logs") pod "0f5b840a-bd1e-4e3e-b970-c531e84a22ff" (UID: "0f5b840a-bd1e-4e3e-b970-c531e84a22ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.439620 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-scripts" (OuterVolumeSpecName: "scripts") pod "0f5b840a-bd1e-4e3e-b970-c531e84a22ff" (UID: "0f5b840a-bd1e-4e3e-b970-c531e84a22ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.439753 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-config-data" (OuterVolumeSpecName: "config-data") pod "0f5b840a-bd1e-4e3e-b970-c531e84a22ff" (UID: "0f5b840a-bd1e-4e3e-b970-c531e84a22ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.440326 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.440356 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.440372 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.440384 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91b42c5b-3a13-4405-b679-546a26c2e78e-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.440398 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.440509 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-config-data" (OuterVolumeSpecName: "config-data") pod "91b42c5b-3a13-4405-b679-546a26c2e78e" (UID: "91b42c5b-3a13-4405-b679-546a26c2e78e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.462761 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86b6976b49-4hvbm" event={"ID":"91b42c5b-3a13-4405-b679-546a26c2e78e","Type":"ContainerDied","Data":"31f41f615ec9714c586b7a6d19d35e45b863b9b593b2e84ad25aed8dd7f0f0a9"}
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.462903 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86b6976b49-4hvbm"
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.463584 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-kube-api-access-wbz2h" (OuterVolumeSpecName: "kube-api-access-wbz2h") pod "0f5b840a-bd1e-4e3e-b970-c531e84a22ff" (UID: "0f5b840a-bd1e-4e3e-b970-c531e84a22ff"). InnerVolumeSpecName "kube-api-access-wbz2h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.464250 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc07138-7824-4a54-957c-52e9dd4120ac-kube-api-access-hz89f" (OuterVolumeSpecName: "kube-api-access-hz89f") pod "3dc07138-7824-4a54-957c-52e9dd4120ac" (UID: "3dc07138-7824-4a54-957c-52e9dd4120ac"). InnerVolumeSpecName "kube-api-access-hz89f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.464329 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91b42c5b-3a13-4405-b679-546a26c2e78e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "91b42c5b-3a13-4405-b679-546a26c2e78e" (UID: "91b42c5b-3a13-4405-b679-546a26c2e78e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.464996 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0f5b840a-bd1e-4e3e-b970-c531e84a22ff" (UID: "0f5b840a-bd1e-4e3e-b970-c531e84a22ff"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.466975 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68f5b8d549-7854k" event={"ID":"0f5b840a-bd1e-4e3e-b970-c531e84a22ff","Type":"ContainerDied","Data":"4c32c46dbc8e9d2135302903295706ee8698675219b303b0a475b262fea671be"}
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.467095 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68f5b8d549-7854k"
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.485977 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91b42c5b-3a13-4405-b679-546a26c2e78e-kube-api-access-xrm2f" (OuterVolumeSpecName: "kube-api-access-xrm2f") pod "91b42c5b-3a13-4405-b679-546a26c2e78e" (UID: "91b42c5b-3a13-4405-b679-546a26c2e78e"). InnerVolumeSpecName "kube-api-access-xrm2f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.497232 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-config" (OuterVolumeSpecName: "config") pod "3dc07138-7824-4a54-957c-52e9dd4120ac" (UID: "3dc07138-7824-4a54-957c-52e9dd4120ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.498904 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3dc07138-7824-4a54-957c-52e9dd4120ac" (UID: "3dc07138-7824-4a54-957c-52e9dd4120ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: E0320 16:21:42.504388 4708 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-sb podName:3dc07138-7824-4a54-957c-52e9dd4120ac nodeName:}" failed. No retries permitted until 2026-03-20 16:21:43.004359483 +0000 UTC m=+1257.678696198 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovsdbserver-sb" (UniqueName: "kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-sb") pod "3dc07138-7824-4a54-957c-52e9dd4120ac" (UID: "3dc07138-7824-4a54-957c-52e9dd4120ac") : error deleting /var/lib/kubelet/pods/3dc07138-7824-4a54-957c-52e9dd4120ac/volume-subpaths: remove /var/lib/kubelet/pods/3dc07138-7824-4a54-957c-52e9dd4120ac/volume-subpaths: no such file or directory
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.504803 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3dc07138-7824-4a54-957c-52e9dd4120ac" (UID: "3dc07138-7824-4a54-957c-52e9dd4120ac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.504889 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhwvr"
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.504929 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhwvr" event={"ID":"3dc07138-7824-4a54-957c-52e9dd4120ac","Type":"ContainerDied","Data":"7ff5033d3c65101149ec8da4d40eb9c6ff7d2759b09aadb0674847c1c30afad6"}
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.504987 4708 scope.go:117] "RemoveContainer" containerID="864ab416fb5ece0fa3759dfb52b714e9a2d4af6178734457b61c131fb529c35a"
Mar 20 16:21:42 crc kubenswrapper[4708]: E0320 16:21:42.510305 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-zk9cx" podUID="204070bf-f103-49d9-b366-185454e68b9e"
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.541969 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrm2f\" (UniqueName: \"kubernetes.io/projected/91b42c5b-3a13-4405-b679-546a26c2e78e-kube-api-access-xrm2f\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.542040 4708 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.542053 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91b42c5b-3a13-4405-b679-546a26c2e78e-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.542092 4708 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91b42c5b-3a13-4405-b679-546a26c2e78e-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.542106 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.542117 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.542128 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.542138 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz89f\" (UniqueName: \"kubernetes.io/projected/3dc07138-7824-4a54-957c-52e9dd4120ac-kube-api-access-hz89f\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.542150 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbz2h\" (UniqueName: \"kubernetes.io/projected/0f5b840a-bd1e-4e3e-b970-c531e84a22ff-kube-api-access-wbz2h\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.571244 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b6ff5cbbd-kjfxp"]
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.642577 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68f5b8d549-7854k"]
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.649837 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-68f5b8d549-7854k"]
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.820183 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86b6976b49-4hvbm"]
Mar 20 16:21:42 crc kubenswrapper[4708]: I0320 16:21:42.828124 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86b6976b49-4hvbm"]
Mar 20 16:21:42 crc kubenswrapper[4708]: E0320 16:21:42.924128 4708 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91b42c5b_3a13_4405_b679_546a26c2e78e.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 16:21:43 crc kubenswrapper[4708]: I0320 16:21:43.053065 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-sb\") pod \"3dc07138-7824-4a54-957c-52e9dd4120ac\" (UID: \"3dc07138-7824-4a54-957c-52e9dd4120ac\") "
Mar 20 16:21:43 crc kubenswrapper[4708]: I0320 16:21:43.053523 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3dc07138-7824-4a54-957c-52e9dd4120ac" (UID: "3dc07138-7824-4a54-957c-52e9dd4120ac"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:43 crc kubenswrapper[4708]: I0320 16:21:43.054065 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc07138-7824-4a54-957c-52e9dd4120ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:43 crc kubenswrapper[4708]: I0320 16:21:43.141750 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhwvr"]
Mar 20 16:21:43 crc kubenswrapper[4708]: I0320 16:21:43.156697 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhwvr"]
Mar 20 16:21:44 crc kubenswrapper[4708]: I0320 16:21:44.120347 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f5b840a-bd1e-4e3e-b970-c531e84a22ff" path="/var/lib/kubelet/pods/0f5b840a-bd1e-4e3e-b970-c531e84a22ff/volumes"
Mar 20 16:21:44 crc kubenswrapper[4708]: I0320 16:21:44.121139 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc07138-7824-4a54-957c-52e9dd4120ac" path="/var/lib/kubelet/pods/3dc07138-7824-4a54-957c-52e9dd4120ac/volumes"
Mar 20 16:21:44 crc kubenswrapper[4708]: I0320 16:21:44.121992 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91b42c5b-3a13-4405-b679-546a26c2e78e" path="/var/lib/kubelet/pods/91b42c5b-3a13-4405-b679-546a26c2e78e/volumes"
Mar 20 16:21:44 crc kubenswrapper[4708]: I0320 16:21:44.762434 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-qhwvr" podUID="3dc07138-7824-4a54-957c-52e9dd4120ac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout"
Mar 20 16:21:46 crc kubenswrapper[4708]: W0320 16:21:46.284553 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48bfe1e4_0a34_4af1_badd_c445d2c02ce1.slice/crio-e08be265e3cb151fb776664f40c15117f7fa813cab9614a156275362d8f4bc6c WatchSource:0}: Error finding container e08be265e3cb151fb776664f40c15117f7fa813cab9614a156275362d8f4bc6c: Status 404 returned error can't find the container with id e08be265e3cb151fb776664f40c15117f7fa813cab9614a156275362d8f4bc6c
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.316483 4708 scope.go:117] "RemoveContainer" containerID="55c5f797db2a66deea35d687a66aef76774721fa4fbf12e519b6397bb7b2ffb7"
Mar 20 16:21:46 crc kubenswrapper[4708]: E0320 16:21:46.504060 4708 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Mar 20 16:21:46 crc kubenswrapper[4708]: E0320 16:21:46.504480 4708 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nzb9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fgjlj_openstack(c46a759f-98ab-495d-9cab-ba1f2fbbb112): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 16:21:46 crc kubenswrapper[4708]: E0320 16:21:46.505791 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fgjlj" podUID="c46a759f-98ab-495d-9cab-ba1f2fbbb112"
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.516915 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.521582 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-config-data\") pod \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") "
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.521710 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnbf6\" (UniqueName: \"kubernetes.io/projected/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-kube-api-access-qnbf6\") pod \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") "
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.521764 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-scripts\") pod \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") "
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.521815 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-horizon-secret-key\") pod \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") "
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.521847 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-logs\") pod \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\" (UID: \"73f2d616-aca1-4db4-890a-e6b02eaa9f4a\") "
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.522263 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-scripts" (OuterVolumeSpecName: "scripts") pod "73f2d616-aca1-4db4-890a-e6b02eaa9f4a" (UID: "73f2d616-aca1-4db4-890a-e6b02eaa9f4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.522570 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-config-data" (OuterVolumeSpecName: "config-data") pod "73f2d616-aca1-4db4-890a-e6b02eaa9f4a" (UID: "73f2d616-aca1-4db4-890a-e6b02eaa9f4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.522603 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-logs" (OuterVolumeSpecName: "logs") pod "73f2d616-aca1-4db4-890a-e6b02eaa9f4a" (UID: "73f2d616-aca1-4db4-890a-e6b02eaa9f4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.528971 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "73f2d616-aca1-4db4-890a-e6b02eaa9f4a" (UID: "73f2d616-aca1-4db4-890a-e6b02eaa9f4a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.530928 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-kube-api-access-qnbf6" (OuterVolumeSpecName: "kube-api-access-qnbf6") pod "73f2d616-aca1-4db4-890a-e6b02eaa9f4a" (UID: "73f2d616-aca1-4db4-890a-e6b02eaa9f4a"). InnerVolumeSpecName "kube-api-access-qnbf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.541444 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56c48d8cfc-q27gm" event={"ID":"73f2d616-aca1-4db4-890a-e6b02eaa9f4a","Type":"ContainerDied","Data":"d98f7c935b937e6c51a555193c4238aa31edfec97324d1e45fa5ebb635215fba"}
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.541539 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56c48d8cfc-q27gm"
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.544729 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rpnp8" event={"ID":"e73e6a53-ccd4-45bf-ad96-a6de1e696888","Type":"ContainerDied","Data":"eff40708df8220974aab932d0dba72e5682c7cc92ee1fc0117de989f1113dae5"}
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.544770 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eff40708df8220974aab932d0dba72e5682c7cc92ee1fc0117de989f1113dae5"
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.553600 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6ff5cbbd-kjfxp" event={"ID":"48bfe1e4-0a34-4af1-badd-c445d2c02ce1","Type":"ContainerStarted","Data":"e08be265e3cb151fb776664f40c15117f7fa813cab9614a156275362d8f4bc6c"}
Mar 20 16:21:46 crc kubenswrapper[4708]: E0320 16:21:46.556623 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-fgjlj" podUID="c46a759f-98ab-495d-9cab-ba1f2fbbb112"
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.594371 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rpnp8"
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.624402 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.624781 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnbf6\" (UniqueName: \"kubernetes.io/projected/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-kube-api-access-qnbf6\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.625075 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.625778 4708 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.625859 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73f2d616-aca1-4db4-890a-e6b02eaa9f4a-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.712293 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56c48d8cfc-q27gm"]
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.726892 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jkq7\" (UniqueName: \"kubernetes.io/projected/e73e6a53-ccd4-45bf-ad96-a6de1e696888-kube-api-access-4jkq7\") pod \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") "
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.726949 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-config-data\") pod \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") "
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.727005 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-combined-ca-bundle\") pod \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") "
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.727034 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-db-sync-config-data\") pod \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\" (UID: \"e73e6a53-ccd4-45bf-ad96-a6de1e696888\") "
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.728310 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-56c48d8cfc-q27gm"]
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.731648 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e73e6a53-ccd4-45bf-ad96-a6de1e696888-kube-api-access-4jkq7" (OuterVolumeSpecName: "kube-api-access-4jkq7") pod "e73e6a53-ccd4-45bf-ad96-a6de1e696888" (UID: "e73e6a53-ccd4-45bf-ad96-a6de1e696888"). InnerVolumeSpecName "kube-api-access-4jkq7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.733027 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e73e6a53-ccd4-45bf-ad96-a6de1e696888" (UID: "e73e6a53-ccd4-45bf-ad96-a6de1e696888"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.755480 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e73e6a53-ccd4-45bf-ad96-a6de1e696888" (UID: "e73e6a53-ccd4-45bf-ad96-a6de1e696888"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.781539 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-config-data" (OuterVolumeSpecName: "config-data") pod "e73e6a53-ccd4-45bf-ad96-a6de1e696888" (UID: "e73e6a53-ccd4-45bf-ad96-a6de1e696888"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.787357 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bc9dd67b8-mz4lv"]
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.828982 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.829014 4708 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.829027 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jkq7\" (UniqueName: \"kubernetes.io/projected/e73e6a53-ccd4-45bf-ad96-a6de1e696888-kube-api-access-4jkq7\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.829039 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e73e6a53-ccd4-45bf-ad96-a6de1e696888-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:46 crc kubenswrapper[4708]: I0320 16:21:46.858830 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q5htg"]
Mar 20 16:21:46 crc kubenswrapper[4708]: W0320 16:21:46.903819 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod341b59c3_684f_45e4_9d42_ed258e0e671b.slice/crio-625a79071f0232d857484ce3fb76703b76a28fb2388c0fd506d5fd532b79307e WatchSource:0}: Error finding container 625a79071f0232d857484ce3fb76703b76a28fb2388c0fd506d5fd532b79307e: Status 404 returned error can't find the container with id 625a79071f0232d857484ce3fb76703b76a28fb2388c0fd506d5fd532b79307e
Mar 20 16:21:46 crc kubenswrapper[4708]: W0320 16:21:46.906594 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15901de5_ddbe_4c7b_8968_8c614619be4d.slice/crio-e1bb97b5159ccdb62657fb1c05ef0b3f9cd4f412d7854232fcaf318999e2f698 WatchSource:0}: Error finding container e1bb97b5159ccdb62657fb1c05ef0b3f9cd4f412d7854232fcaf318999e2f698: Status 404 returned error can't find the container with id e1bb97b5159ccdb62657fb1c05ef0b3f9cd4f412d7854232fcaf318999e2f698
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.099720 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.562012 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bc9dd67b8-mz4lv" event={"ID":"15901de5-ddbe-4c7b-8968-8c614619be4d","Type":"ContainerStarted","Data":"e1bb97b5159ccdb62657fb1c05ef0b3f9cd4f412d7854232fcaf318999e2f698"}
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.566979 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5htg" event={"ID":"341b59c3-684f-45e4-9d42-ed258e0e671b","Type":"ContainerStarted","Data":"c26bc67a464e71977a18197777127053a8d65ea3985cde8cb4f1829105e1bc4e"}
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.567034 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5htg" event={"ID":"341b59c3-684f-45e4-9d42-ed258e0e671b","Type":"ContainerStarted","Data":"625a79071f0232d857484ce3fb76703b76a28fb2388c0fd506d5fd532b79307e"}
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.571262 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7q7vc" event={"ID":"3ded2837-c536-490b-a13c-2a09ea07a7aa","Type":"ContainerStarted","Data":"b4456b51d786af4b0433c292a77b61ade2e20ded4c15e1d30615eb8b6fcee510"}
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.574464 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6ff5cbbd-kjfxp" event={"ID":"48bfe1e4-0a34-4af1-badd-c445d2c02ce1","Type":"ContainerStarted","Data":"79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e"}
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.577185 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5121a54a-778a-4b46-9726-a4ba2901042b","Type":"ContainerStarted","Data":"9bfe310440d497465aa1c16e3fb8dae60cfad012896e7485f69d1e1aa9a863a6"}
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.579119 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"55a7b23462a3d8c45a03f606157e26c687b28a72b2e88381b9150b120c2707f0"}
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.579166 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rpnp8"
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.591117 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-q5htg" podStartSLOduration=22.591097979 podStartE2EDuration="22.591097979s" podCreationTimestamp="2026-03-20 16:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:47.585089344 +0000 UTC m=+1262.259426059" watchObservedRunningTime="2026-03-20 16:21:47.591097979 +0000 UTC m=+1262.265434694"
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.602415 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7q7vc" podStartSLOduration=7.304340417 podStartE2EDuration="33.602396818s" podCreationTimestamp="2026-03-20 16:21:14 +0000 UTC" firstStartedPulling="2026-03-20 16:21:15.817791635 +0000 UTC m=+1230.492128350" lastFinishedPulling="2026-03-20 16:21:42.115848036 +0000 UTC m=+1256.790184751" observedRunningTime="2026-03-20 16:21:47.600534818 +0000 UTC m=+1262.274871533" watchObservedRunningTime="2026-03-20 16:21:47.602396818 +0000 UTC m=+1262.276733533"
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.942973 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-lnqzp"]
Mar 20 16:21:47 crc kubenswrapper[4708]: E0320 16:21:47.943830 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e73e6a53-ccd4-45bf-ad96-a6de1e696888" containerName="glance-db-sync"
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.943851 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="e73e6a53-ccd4-45bf-ad96-a6de1e696888" containerName="glance-db-sync"
Mar 20 16:21:47 crc kubenswrapper[4708]: E0320 16:21:47.943872 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc07138-7824-4a54-957c-52e9dd4120ac" containerName="dnsmasq-dns"
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.943880 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc07138-7824-4a54-957c-52e9dd4120ac" containerName="dnsmasq-dns"
Mar 20 16:21:47 crc kubenswrapper[4708]: E0320 16:21:47.943901 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc07138-7824-4a54-957c-52e9dd4120ac" containerName="init"
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.943909 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc07138-7824-4a54-957c-52e9dd4120ac" containerName="init"
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.944092 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc07138-7824-4a54-957c-52e9dd4120ac" containerName="dnsmasq-dns"
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.944113 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="e73e6a53-ccd4-45bf-ad96-a6de1e696888" containerName="glance-db-sync"
Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.945199 4708 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:47 crc kubenswrapper[4708]: I0320 16:21:47.975069 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-lnqzp"] Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.063390 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.063450 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-dns-svc\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.063788 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.063887 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-config\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.063961 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ltd4z\" (UniqueName: \"kubernetes.io/projected/73f0ea77-b504-4e96-b463-f1b69dcefc6b-kube-api-access-ltd4z\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.137439 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f2d616-aca1-4db4-890a-e6b02eaa9f4a" path="/var/lib/kubelet/pods/73f2d616-aca1-4db4-890a-e6b02eaa9f4a/volumes" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.167857 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.169056 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-dns-svc\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.169313 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.170023 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-config\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " 
pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.170153 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltd4z\" (UniqueName: \"kubernetes.io/projected/73f0ea77-b504-4e96-b463-f1b69dcefc6b-kube-api-access-ltd4z\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.169010 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.170721 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-dns-svc\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.185788 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-config\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.186419 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.203952 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltd4z\" (UniqueName: \"kubernetes.io/projected/73f0ea77-b504-4e96-b463-f1b69dcefc6b-kube-api-access-ltd4z\") pod \"dnsmasq-dns-f84976bdf-lnqzp\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.287515 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.615899 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"d0a5ecb2140ec3645e14fb7186cc595544f13fb5bce6b41997179227f7137391"} Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.616258 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"734e599c62285d8c1d9310e3cf37a8fa14e56695b7b65aeb2a400ae1a4f7425e"} Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.616274 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"5c83c6e9f32e7d8946d2203cdbf1bb9c083c9c70da141067688644d02bfc79a4"} Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.631940 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bc9dd67b8-mz4lv" event={"ID":"15901de5-ddbe-4c7b-8968-8c614619be4d","Type":"ContainerStarted","Data":"0bad11576bfce0249830ee1bc828d7ce1834f90f89b9f438e23543b88b848776"} Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.631991 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bc9dd67b8-mz4lv" 
event={"ID":"15901de5-ddbe-4c7b-8968-8c614619be4d","Type":"ContainerStarted","Data":"a003206c474c1b5b0a919684398bed68bf261a0e82415b0ccc991ace72d6b3a4"} Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.643971 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6ff5cbbd-kjfxp" event={"ID":"48bfe1e4-0a34-4af1-badd-c445d2c02ce1","Type":"ContainerStarted","Data":"ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c"} Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.674850 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7bc9dd67b8-mz4lv" podStartSLOduration=26.254170362 podStartE2EDuration="26.674830348s" podCreationTimestamp="2026-03-20 16:21:22 +0000 UTC" firstStartedPulling="2026-03-20 16:21:46.908736908 +0000 UTC m=+1261.583073623" lastFinishedPulling="2026-03-20 16:21:47.329396894 +0000 UTC m=+1262.003733609" observedRunningTime="2026-03-20 16:21:48.658533791 +0000 UTC m=+1263.332870526" watchObservedRunningTime="2026-03-20 16:21:48.674830348 +0000 UTC m=+1263.349167053" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.695499 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6b6ff5cbbd-kjfxp" podStartSLOduration=25.778745326 podStartE2EDuration="26.695477043s" podCreationTimestamp="2026-03-20 16:21:22 +0000 UTC" firstStartedPulling="2026-03-20 16:21:46.287761318 +0000 UTC m=+1260.962098043" lastFinishedPulling="2026-03-20 16:21:47.204493045 +0000 UTC m=+1261.878829760" observedRunningTime="2026-03-20 16:21:48.690585769 +0000 UTC m=+1263.364922504" watchObservedRunningTime="2026-03-20 16:21:48.695477043 +0000 UTC m=+1263.369813758" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.844749 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-lnqzp"] Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.858270 4708 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.861375 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.865009 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.868528 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.870022 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fkxnk" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.871265 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.993570 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.993981 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.994010 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-logs\") pod 
\"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.994034 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.994074 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.994105 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:48 crc kubenswrapper[4708]: I0320 16:21:48.994124 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bglhl\" (UniqueName: \"kubernetes.io/projected/ff22ac59-4905-4ddf-975d-2047f67dc4be-kube-api-access-bglhl\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.096113 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.096185 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.096205 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bglhl\" (UniqueName: \"kubernetes.io/projected/ff22ac59-4905-4ddf-975d-2047f67dc4be-kube-api-access-bglhl\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.096250 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.096309 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.096338 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-logs\") pod \"glance-default-external-api-0\" (UID: 
\"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.096357 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.097004 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.097379 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.098268 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.100008 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.101042 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-logs\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.104001 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.104581 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.104581 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.108534 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.136551 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.142644 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bglhl\" (UniqueName: \"kubernetes.io/projected/ff22ac59-4905-4ddf-975d-2047f67dc4be-kube-api-access-bglhl\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.171130 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.198062 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.198204 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.198241 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c6xp\" (UniqueName: \"kubernetes.io/projected/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-kube-api-access-4c6xp\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.198291 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.198363 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.198448 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.198479 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.300535 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.300645 4708 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.300694 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.300759 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.300812 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.300831 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c6xp\" (UniqueName: \"kubernetes.io/projected/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-kube-api-access-4c6xp\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.300858 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.301474 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.306948 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.307090 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.308480 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.311760 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-config-data\") pod \"glance-default-internal-api-0\" 
(UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.322727 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.332447 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c6xp\" (UniqueName: \"kubernetes.io/projected/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-kube-api-access-4c6xp\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.342580 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.360176 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.421155 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.697017 4708 generic.go:334] "Generic (PLEG): container finished" podID="73f0ea77-b504-4e96-b463-f1b69dcefc6b" containerID="c952ff4e08fd4025465d973eef1a8368a30c1a8187e9b0c465cb890ebc07041b" exitCode=0 Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.698360 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" event={"ID":"73f0ea77-b504-4e96-b463-f1b69dcefc6b","Type":"ContainerDied","Data":"c952ff4e08fd4025465d973eef1a8368a30c1a8187e9b0c465cb890ebc07041b"} Mar 20 16:21:49 crc kubenswrapper[4708]: I0320 16:21:49.698393 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" event={"ID":"73f0ea77-b504-4e96-b463-f1b69dcefc6b","Type":"ContainerStarted","Data":"996731fe59ce7ff24948ab9e2a723ac39d3215613a7de9b4310921f4f7a51a07"} Mar 20 16:21:50 crc kubenswrapper[4708]: I0320 16:21:50.156936 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:21:50 crc kubenswrapper[4708]: I0320 16:21:50.242712 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:21:50 crc kubenswrapper[4708]: W0320 16:21:50.685152 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff22ac59_4905_4ddf_975d_2047f67dc4be.slice/crio-7a544d1fc6bf92656483462c57b17af491bc331140fbc212a5721218401205be WatchSource:0}: Error finding container 7a544d1fc6bf92656483462c57b17af491bc331140fbc212a5721218401205be: Status 404 returned error can't find the container with id 7a544d1fc6bf92656483462c57b17af491bc331140fbc212a5721218401205be Mar 20 16:21:50 crc kubenswrapper[4708]: W0320 16:21:50.686616 4708 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d4f8abd_eb29_4556_ae80_3cfd4a3e6735.slice/crio-e2549dffe577024ab4017ccb53a8a5c19a3a7bc6e4b6710f4756270787ff852c WatchSource:0}: Error finding container e2549dffe577024ab4017ccb53a8a5c19a3a7bc6e4b6710f4756270787ff852c: Status 404 returned error can't find the container with id e2549dffe577024ab4017ccb53a8a5c19a3a7bc6e4b6710f4756270787ff852c Mar 20 16:21:50 crc kubenswrapper[4708]: I0320 16:21:50.713593 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" event={"ID":"73f0ea77-b504-4e96-b463-f1b69dcefc6b","Type":"ContainerStarted","Data":"0a9c68c23416d7c03f14ca198ef1c7e634267e31b639f9e72cc553c2f635fcc5"} Mar 20 16:21:50 crc kubenswrapper[4708]: I0320 16:21:50.714745 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:50 crc kubenswrapper[4708]: I0320 16:21:50.718104 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff22ac59-4905-4ddf-975d-2047f67dc4be","Type":"ContainerStarted","Data":"7a544d1fc6bf92656483462c57b17af491bc331140fbc212a5721218401205be"} Mar 20 16:21:50 crc kubenswrapper[4708]: I0320 16:21:50.720439 4708 generic.go:334] "Generic (PLEG): container finished" podID="3ded2837-c536-490b-a13c-2a09ea07a7aa" containerID="b4456b51d786af4b0433c292a77b61ade2e20ded4c15e1d30615eb8b6fcee510" exitCode=0 Mar 20 16:21:50 crc kubenswrapper[4708]: I0320 16:21:50.720561 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7q7vc" event={"ID":"3ded2837-c536-490b-a13c-2a09ea07a7aa","Type":"ContainerDied","Data":"b4456b51d786af4b0433c292a77b61ade2e20ded4c15e1d30615eb8b6fcee510"} Mar 20 16:21:50 crc kubenswrapper[4708]: I0320 16:21:50.744062 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735","Type":"ContainerStarted","Data":"e2549dffe577024ab4017ccb53a8a5c19a3a7bc6e4b6710f4756270787ff852c"} Mar 20 16:21:50 crc kubenswrapper[4708]: I0320 16:21:50.763428 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" podStartSLOduration=3.763407764 podStartE2EDuration="3.763407764s" podCreationTimestamp="2026-03-20 16:21:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:50.744487277 +0000 UTC m=+1265.418824002" watchObservedRunningTime="2026-03-20 16:21:50.763407764 +0000 UTC m=+1265.437744479" Mar 20 16:21:51 crc kubenswrapper[4708]: I0320 16:21:51.371773 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:21:51 crc kubenswrapper[4708]: I0320 16:21:51.477147 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:21:51 crc kubenswrapper[4708]: I0320 16:21:51.779220 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"15c44ad9defff7cafc193fc8f7879f737616fb4c3849fe462057758b43486f54"} Mar 20 16:21:52 crc kubenswrapper[4708]: I0320 16:21:52.766014 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:52 crc kubenswrapper[4708]: I0320 16:21:52.766063 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:21:52 crc kubenswrapper[4708]: I0320 16:21:52.792526 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"ff22ac59-4905-4ddf-975d-2047f67dc4be","Type":"ContainerStarted","Data":"fe02ac34b8493c1e2b6bfff4ab21a92688407369014185039b3f016745caea03"} Mar 20 16:21:52 crc kubenswrapper[4708]: I0320 16:21:52.796054 4708 generic.go:334] "Generic (PLEG): container finished" podID="341b59c3-684f-45e4-9d42-ed258e0e671b" containerID="c26bc67a464e71977a18197777127053a8d65ea3985cde8cb4f1829105e1bc4e" exitCode=0 Mar 20 16:21:52 crc kubenswrapper[4708]: I0320 16:21:52.796108 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5htg" event={"ID":"341b59c3-684f-45e4-9d42-ed258e0e671b","Type":"ContainerDied","Data":"c26bc67a464e71977a18197777127053a8d65ea3985cde8cb4f1829105e1bc4e"} Mar 20 16:21:52 crc kubenswrapper[4708]: I0320 16:21:52.800910 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735","Type":"ContainerStarted","Data":"13f8f5fe5a6eca2dbd0a42822bc6e8579ee26b3846c842eaa48eed01f89e9ec5"} Mar 20 16:21:52 crc kubenswrapper[4708]: I0320 16:21:52.803326 4708 generic.go:334] "Generic (PLEG): container finished" podID="52e4d34b-0c95-475c-b9e5-be1dff27d5a3" containerID="612769ba32a5de6e87931c0a149f996bebf73631f256e8394b11aa927a7f24f1" exitCode=0 Mar 20 16:21:52 crc kubenswrapper[4708]: I0320 16:21:52.803382 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sr6vd" event={"ID":"52e4d34b-0c95-475c-b9e5-be1dff27d5a3","Type":"ContainerDied","Data":"612769ba32a5de6e87931c0a149f996bebf73631f256e8394b11aa927a7f24f1"} Mar 20 16:21:52 crc kubenswrapper[4708]: I0320 16:21:52.814209 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"97d9b990f2aa51a27c9cfa8f2f1cedfff503c00eb07df89a63c66dedb48f09d9"} Mar 20 16:21:52 crc kubenswrapper[4708]: I0320 16:21:52.866335 4708 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:52 crc kubenswrapper[4708]: I0320 16:21:52.866407 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.431252 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.513355 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-config-data\") pod \"3ded2837-c536-490b-a13c-2a09ea07a7aa\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.513645 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-combined-ca-bundle\") pod \"3ded2837-c536-490b-a13c-2a09ea07a7aa\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.514539 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-scripts\") pod \"3ded2837-c536-490b-a13c-2a09ea07a7aa\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.514583 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ded2837-c536-490b-a13c-2a09ea07a7aa-logs\") pod \"3ded2837-c536-490b-a13c-2a09ea07a7aa\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.514629 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz82t\" (UniqueName: 
\"kubernetes.io/projected/3ded2837-c536-490b-a13c-2a09ea07a7aa-kube-api-access-zz82t\") pod \"3ded2837-c536-490b-a13c-2a09ea07a7aa\" (UID: \"3ded2837-c536-490b-a13c-2a09ea07a7aa\") " Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.515491 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ded2837-c536-490b-a13c-2a09ea07a7aa-logs" (OuterVolumeSpecName: "logs") pod "3ded2837-c536-490b-a13c-2a09ea07a7aa" (UID: "3ded2837-c536-490b-a13c-2a09ea07a7aa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.534037 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-scripts" (OuterVolumeSpecName: "scripts") pod "3ded2837-c536-490b-a13c-2a09ea07a7aa" (UID: "3ded2837-c536-490b-a13c-2a09ea07a7aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.534040 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ded2837-c536-490b-a13c-2a09ea07a7aa-kube-api-access-zz82t" (OuterVolumeSpecName: "kube-api-access-zz82t") pod "3ded2837-c536-490b-a13c-2a09ea07a7aa" (UID: "3ded2837-c536-490b-a13c-2a09ea07a7aa"). InnerVolumeSpecName "kube-api-access-zz82t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.542317 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ded2837-c536-490b-a13c-2a09ea07a7aa" (UID: "3ded2837-c536-490b-a13c-2a09ea07a7aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.556502 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-config-data" (OuterVolumeSpecName: "config-data") pod "3ded2837-c536-490b-a13c-2a09ea07a7aa" (UID: "3ded2837-c536-490b-a13c-2a09ea07a7aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.617183 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.617217 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.617235 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ded2837-c536-490b-a13c-2a09ea07a7aa-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.617246 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz82t\" (UniqueName: \"kubernetes.io/projected/3ded2837-c536-490b-a13c-2a09ea07a7aa-kube-api-access-zz82t\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.617258 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ded2837-c536-490b-a13c-2a09ea07a7aa-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.841200 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7q7vc" Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.842136 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7q7vc" event={"ID":"3ded2837-c536-490b-a13c-2a09ea07a7aa","Type":"ContainerDied","Data":"15fd3b4ce0fe54562d451590754f685f086e3116e19546500c2a9179022ab7d6"} Mar 20 16:21:53 crc kubenswrapper[4708]: I0320 16:21:53.842338 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15fd3b4ce0fe54562d451590754f685f086e3116e19546500c2a9179022ab7d6" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.540313 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-798d756d44-mhhzm"] Mar 20 16:21:54 crc kubenswrapper[4708]: E0320 16:21:54.541120 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ded2837-c536-490b-a13c-2a09ea07a7aa" containerName="placement-db-sync" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.541139 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ded2837-c536-490b-a13c-2a09ea07a7aa" containerName="placement-db-sync" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.541314 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ded2837-c536-490b-a13c-2a09ea07a7aa" containerName="placement-db-sync" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.542190 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.549841 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.550434 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2gwpt" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.551001 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.564207 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.564428 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.589927 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-798d756d44-mhhzm"] Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.634852 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-internal-tls-certs\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.634938 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06b07110-3c5e-476a-9ff2-a460d043afc4-logs\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.634969 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-scripts\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.634990 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-public-tls-certs\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.635028 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4m8z\" (UniqueName: \"kubernetes.io/projected/06b07110-3c5e-476a-9ff2-a460d043afc4-kube-api-access-s4m8z\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.635117 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-combined-ca-bundle\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.635146 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-config-data\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.736730 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-combined-ca-bundle\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.736788 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-config-data\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.736853 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-internal-tls-certs\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.736889 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06b07110-3c5e-476a-9ff2-a460d043afc4-logs\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.736913 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-scripts\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.736929 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-public-tls-certs\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.736959 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4m8z\" (UniqueName: \"kubernetes.io/projected/06b07110-3c5e-476a-9ff2-a460d043afc4-kube-api-access-s4m8z\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.740130 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06b07110-3c5e-476a-9ff2-a460d043afc4-logs\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.742481 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-combined-ca-bundle\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.759527 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-config-data\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.759590 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4m8z\" (UniqueName: \"kubernetes.io/projected/06b07110-3c5e-476a-9ff2-a460d043afc4-kube-api-access-s4m8z\") pod 
\"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.765163 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-scripts\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.766548 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-internal-tls-certs\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.767871 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-public-tls-certs\") pod \"placement-798d756d44-mhhzm\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:54 crc kubenswrapper[4708]: I0320 16:21:54.882814 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.520918 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.547086 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.570818 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-fernet-keys\") pod \"341b59c3-684f-45e4-9d42-ed258e0e671b\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.570859 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q4k7\" (UniqueName: \"kubernetes.io/projected/341b59c3-684f-45e4-9d42-ed258e0e671b-kube-api-access-6q4k7\") pod \"341b59c3-684f-45e4-9d42-ed258e0e671b\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.570953 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-scripts\") pod \"341b59c3-684f-45e4-9d42-ed258e0e671b\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.571019 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phsjk\" (UniqueName: \"kubernetes.io/projected/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-kube-api-access-phsjk\") pod \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.571085 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-combined-ca-bundle\") pod \"341b59c3-684f-45e4-9d42-ed258e0e671b\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.571130 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-credential-keys\") pod \"341b59c3-684f-45e4-9d42-ed258e0e671b\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.571201 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-config-data\") pod \"341b59c3-684f-45e4-9d42-ed258e0e671b\" (UID: \"341b59c3-684f-45e4-9d42-ed258e0e671b\") " Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.571218 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-combined-ca-bundle\") pod \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.571252 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-config\") pod \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\" (UID: \"52e4d34b-0c95-475c-b9e5-be1dff27d5a3\") " Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.575886 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "341b59c3-684f-45e4-9d42-ed258e0e671b" (UID: "341b59c3-684f-45e4-9d42-ed258e0e671b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.578786 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "341b59c3-684f-45e4-9d42-ed258e0e671b" (UID: "341b59c3-684f-45e4-9d42-ed258e0e671b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.578827 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/341b59c3-684f-45e4-9d42-ed258e0e671b-kube-api-access-6q4k7" (OuterVolumeSpecName: "kube-api-access-6q4k7") pod "341b59c3-684f-45e4-9d42-ed258e0e671b" (UID: "341b59c3-684f-45e4-9d42-ed258e0e671b"). InnerVolumeSpecName "kube-api-access-6q4k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.579437 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-scripts" (OuterVolumeSpecName: "scripts") pod "341b59c3-684f-45e4-9d42-ed258e0e671b" (UID: "341b59c3-684f-45e4-9d42-ed258e0e671b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.580290 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-kube-api-access-phsjk" (OuterVolumeSpecName: "kube-api-access-phsjk") pod "52e4d34b-0c95-475c-b9e5-be1dff27d5a3" (UID: "52e4d34b-0c95-475c-b9e5-be1dff27d5a3"). InnerVolumeSpecName "kube-api-access-phsjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.617443 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52e4d34b-0c95-475c-b9e5-be1dff27d5a3" (UID: "52e4d34b-0c95-475c-b9e5-be1dff27d5a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.646714 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-config" (OuterVolumeSpecName: "config") pod "52e4d34b-0c95-475c-b9e5-be1dff27d5a3" (UID: "52e4d34b-0c95-475c-b9e5-be1dff27d5a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.659195 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "341b59c3-684f-45e4-9d42-ed258e0e671b" (UID: "341b59c3-684f-45e4-9d42-ed258e0e671b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.673932 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.673965 4708 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.673976 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.673988 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.673999 4708 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.674010 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q4k7\" (UniqueName: \"kubernetes.io/projected/341b59c3-684f-45e4-9d42-ed258e0e671b-kube-api-access-6q4k7\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.674021 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.674034 4708 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phsjk\" (UniqueName: \"kubernetes.io/projected/52e4d34b-0c95-475c-b9e5-be1dff27d5a3-kube-api-access-phsjk\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.704218 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-config-data" (OuterVolumeSpecName: "config-data") pod "341b59c3-684f-45e4-9d42-ed258e0e671b" (UID: "341b59c3-684f-45e4-9d42-ed258e0e671b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.793325 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341b59c3-684f-45e4-9d42-ed258e0e671b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:56 crc kubenswrapper[4708]: W0320 16:21:56.855776 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06b07110_3c5e_476a_9ff2_a460d043afc4.slice/crio-05e153e621495e13c1164df458b65622618d75a9d3e0f4c763e1637e419babc5 WatchSource:0}: Error finding container 05e153e621495e13c1164df458b65622618d75a9d3e0f4c763e1637e419babc5: Status 404 returned error can't find the container with id 05e153e621495e13c1164df458b65622618d75a9d3e0f4c763e1637e419babc5 Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.855970 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-798d756d44-mhhzm"] Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.877528 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"3640d4efd5d98b77adf3aaa131f8b5c932e414f2049cb35282c4db2d8e62b4ce"} Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.880984 4708 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-798d756d44-mhhzm" event={"ID":"06b07110-3c5e-476a-9ff2-a460d043afc4","Type":"ContainerStarted","Data":"05e153e621495e13c1164df458b65622618d75a9d3e0f4c763e1637e419babc5"} Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.883788 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5htg" event={"ID":"341b59c3-684f-45e4-9d42-ed258e0e671b","Type":"ContainerDied","Data":"625a79071f0232d857484ce3fb76703b76a28fb2388c0fd506d5fd532b79307e"} Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.884016 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="625a79071f0232d857484ce3fb76703b76a28fb2388c0fd506d5fd532b79307e" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.884105 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q5htg" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.887654 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sr6vd" event={"ID":"52e4d34b-0c95-475c-b9e5-be1dff27d5a3","Type":"ContainerDied","Data":"3b77373fd4ca356a1459aa2e3706bfa9ae9fe7760173b67b5d9aebd29dc326dd"} Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.887713 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b77373fd4ca356a1459aa2e3706bfa9ae9fe7760173b67b5d9aebd29dc326dd" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.887782 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sr6vd" Mar 20 16:21:56 crc kubenswrapper[4708]: I0320 16:21:56.903689 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5121a54a-778a-4b46-9726-a4ba2901042b","Type":"ContainerStarted","Data":"baa67816926af97c6686cfc2905d6b9f42787cad9e04af7895c47d3fa120f290"} Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.700986 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7bd9698484-kk2kq"] Mar 20 16:21:57 crc kubenswrapper[4708]: E0320 16:21:57.701905 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341b59c3-684f-45e4-9d42-ed258e0e671b" containerName="keystone-bootstrap" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.701918 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="341b59c3-684f-45e4-9d42-ed258e0e671b" containerName="keystone-bootstrap" Mar 20 16:21:57 crc kubenswrapper[4708]: E0320 16:21:57.701933 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e4d34b-0c95-475c-b9e5-be1dff27d5a3" containerName="neutron-db-sync" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.701940 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e4d34b-0c95-475c-b9e5-be1dff27d5a3" containerName="neutron-db-sync" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.702115 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e4d34b-0c95-475c-b9e5-be1dff27d5a3" containerName="neutron-db-sync" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.702125 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="341b59c3-684f-45e4-9d42-ed258e0e671b" containerName="keystone-bootstrap" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.703012 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.705254 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.705416 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.705538 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.710633 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.710657 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vtmgw" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.711085 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.734995 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bd9698484-kk2kq"] Mar 20 16:21:57 crc kubenswrapper[4708]: E0320 16:21:57.764222 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"account-server\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"account-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-account:current-podified\\\"\", failed to \"StartContainer\" for \"account-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-account:current-podified\\\"\", failed to \"StartContainer\" for \"account-reaper\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-swift-account:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="b5b88259-a142-42a0-ab2c-bb0980ad9465" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.812347 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-combined-ca-bundle\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.812441 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-public-tls-certs\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.812493 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-config-data\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.812523 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-scripts\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.812573 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-internal-tls-certs\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.812606 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-fernet-keys\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.812662 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-credential-keys\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.812717 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82rpx\" (UniqueName: \"kubernetes.io/projected/c6ba411f-6368-44dd-a104-84c141bd9092-kube-api-access-82rpx\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.856386 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-lnqzp"] Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.856615 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" podUID="73f0ea77-b504-4e96-b463-f1b69dcefc6b" containerName="dnsmasq-dns" containerID="cri-o://0a9c68c23416d7c03f14ca198ef1c7e634267e31b639f9e72cc553c2f635fcc5" gracePeriod=10 Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 
16:21:57.858899 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.916972 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-credential-keys\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.917323 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82rpx\" (UniqueName: \"kubernetes.io/projected/c6ba411f-6368-44dd-a104-84c141bd9092-kube-api-access-82rpx\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.917414 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-combined-ca-bundle\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.917555 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-public-tls-certs\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.917696 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-config-data\") pod \"keystone-7bd9698484-kk2kq\" (UID: 
\"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.917827 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-scripts\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.918005 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-internal-tls-certs\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.918179 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-fernet-keys\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.927097 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-credential-keys\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.928089 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-fernet-keys\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: 
I0320 16:21:57.934197 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-combined-ca-bundle\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.936608 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735","Type":"ContainerStarted","Data":"034181e78dd49dbec62362055d85c31d6fdf4fd68c8f308d4580c8de5b3e1f9d"} Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.936774 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" containerName="glance-log" containerID="cri-o://13f8f5fe5a6eca2dbd0a42822bc6e8579ee26b3846c842eaa48eed01f89e9ec5" gracePeriod=30 Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.937431 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" containerName="glance-httpd" containerID="cri-o://034181e78dd49dbec62362055d85c31d6fdf4fd68c8f308d4580c8de5b3e1f9d" gracePeriod=30 Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.938982 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-scripts\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.947486 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-config-data\") pod 
\"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.954894 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-public-tls-certs\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.959034 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6ba411f-6368-44dd-a104-84c141bd9092-internal-tls-certs\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:57 crc kubenswrapper[4708]: I0320 16:21:57.969312 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82rpx\" (UniqueName: \"kubernetes.io/projected/c6ba411f-6368-44dd-a104-84c141bd9092-kube-api-access-82rpx\") pod \"keystone-7bd9698484-kk2kq\" (UID: \"c6ba411f-6368-44dd-a104-84c141bd9092\") " pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.020111 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fb745b69-d6hdf"] Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.021823 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"32367beb9b6c190d39e83beb1c75901ef65ca28c429b653b72e948df99a5867e"} Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.021858 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"b71670178d98f60329407f76d178463b9b4007d96137ca4aa76022e3338e10fd"} Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.021872 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"fe11e816c640838eca546910e1fb598c8d03e76ac0e306831320dc0c316c276f"} Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.021883 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"ef586609e1ccea9def146b9a35b2d762d73824091c119522a132fec1d33834c7"} Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.021972 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.043851 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.043825185 podStartE2EDuration="10.043825185s" podCreationTimestamp="2026-03-20 16:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:58.001483186 +0000 UTC m=+1272.675819911" watchObservedRunningTime="2026-03-20 16:21:58.043825185 +0000 UTC m=+1272.718161900" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.046055 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-d6hdf"] Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.047346 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-798d756d44-mhhzm" event={"ID":"06b07110-3c5e-476a-9ff2-a460d043afc4","Type":"ContainerStarted","Data":"a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729"} Mar 20 
16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.047392 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-798d756d44-mhhzm" event={"ID":"06b07110-3c5e-476a-9ff2-a460d043afc4","Type":"ContainerStarted","Data":"31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16"} Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.048104 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.048146 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.075728 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.088986 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff22ac59-4905-4ddf-975d-2047f67dc4be","Type":"ContainerStarted","Data":"69b660f4b2464d8c1455c298897ba44ea605e23834cfcc93bc1f3165ee57abc6"} Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.089152 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ff22ac59-4905-4ddf-975d-2047f67dc4be" containerName="glance-log" containerID="cri-o://fe02ac34b8493c1e2b6bfff4ab21a92688407369014185039b3f016745caea03" gracePeriod=30 Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.089265 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ff22ac59-4905-4ddf-975d-2047f67dc4be" containerName="glance-httpd" containerID="cri-o://69b660f4b2464d8c1455c298897ba44ea605e23834cfcc93bc1f3165ee57abc6" gracePeriod=30 Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.141797 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.141961 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-config\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.142130 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bltqv\" (UniqueName: \"kubernetes.io/projected/1c34b82d-b121-4aba-91fe-18da561344a1-kube-api-access-bltqv\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.142160 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-dns-svc\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.142208 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.248861 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.248985 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-config\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.249167 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bltqv\" (UniqueName: \"kubernetes.io/projected/1c34b82d-b121-4aba-91fe-18da561344a1-kube-api-access-bltqv\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.249202 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-dns-svc\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.249233 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.250242 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.250969 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.251612 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-config\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.252726 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-68db5b9d4d-q2n5k"] Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.254529 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.256469 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-dns-svc\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.259731 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68db5b9d4d-q2n5k"] Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.260224 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.260404 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wgwkm" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.260620 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.267866 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.267847788 podStartE2EDuration="11.267847788s" podCreationTimestamp="2026-03-20 16:21:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:58.225644672 +0000 UTC m=+1272.899981397" watchObservedRunningTime="2026-03-20 16:21:58.267847788 +0000 UTC m=+1272.942184503" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.271236 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.293851 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" 
podUID="73f0ea77-b504-4e96-b463-f1b69dcefc6b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.294636 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bltqv\" (UniqueName: \"kubernetes.io/projected/1c34b82d-b121-4aba-91fe-18da561344a1-kube-api-access-bltqv\") pod \"dnsmasq-dns-fb745b69-d6hdf\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.310832 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-798d756d44-mhhzm" podStartSLOduration=4.310804784 podStartE2EDuration="4.310804784s" podCreationTimestamp="2026-03-20 16:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:21:58.295822043 +0000 UTC m=+1272.970158778" watchObservedRunningTime="2026-03-20 16:21:58.310804784 +0000 UTC m=+1272.985141519" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.351338 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-config\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.352210 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-httpd-config\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.352272 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-combined-ca-bundle\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.352302 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpf7t\" (UniqueName: \"kubernetes.io/projected/bf812fb4-4e10-42ae-bc52-bc08b8749d29-kube-api-access-bpf7t\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.352361 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-ovndb-tls-certs\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.406585 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.454559 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-config\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.454617 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-httpd-config\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.454690 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-combined-ca-bundle\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.454712 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpf7t\" (UniqueName: \"kubernetes.io/projected/bf812fb4-4e10-42ae-bc52-bc08b8749d29-kube-api-access-bpf7t\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.454790 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-ovndb-tls-certs\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc 
kubenswrapper[4708]: I0320 16:21:58.466005 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-ovndb-tls-certs\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.466314 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-httpd-config\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.466808 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-combined-ca-bundle\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.471552 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-config\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.479647 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpf7t\" (UniqueName: \"kubernetes.io/projected/bf812fb4-4e10-42ae-bc52-bc08b8749d29-kube-api-access-bpf7t\") pod \"neutron-68db5b9d4d-q2n5k\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:58 crc kubenswrapper[4708]: I0320 16:21:58.612559 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.045709 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bd9698484-kk2kq"] Mar 20 16:21:59 crc kubenswrapper[4708]: W0320 16:21:59.065043 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6ba411f_6368_44dd_a104_84c141bd9092.slice/crio-1d48299820249e03f84729559c54567de5611a289f5941d083471588ede269ed WatchSource:0}: Error finding container 1d48299820249e03f84729559c54567de5611a289f5941d083471588ede269ed: Status 404 returned error can't find the container with id 1d48299820249e03f84729559c54567de5611a289f5941d083471588ede269ed Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.151851 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.197842 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zk9cx" event={"ID":"204070bf-f103-49d9-b366-185454e68b9e","Type":"ContainerStarted","Data":"89897c70acf5379db0001e6266077a0415262c310b34144e5e0a9c90860b2dd5"} Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.246022 4708 generic.go:334] "Generic (PLEG): container finished" podID="73f0ea77-b504-4e96-b463-f1b69dcefc6b" containerID="0a9c68c23416d7c03f14ca198ef1c7e634267e31b639f9e72cc553c2f635fcc5" exitCode=0 Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.246224 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.246225 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-lnqzp" event={"ID":"73f0ea77-b504-4e96-b463-f1b69dcefc6b","Type":"ContainerDied","Data":"0a9c68c23416d7c03f14ca198ef1c7e634267e31b639f9e72cc553c2f635fcc5"} Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.246297 4708 scope.go:117] "RemoveContainer" containerID="0a9c68c23416d7c03f14ca198ef1c7e634267e31b639f9e72cc553c2f635fcc5" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.289741 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltd4z\" (UniqueName: \"kubernetes.io/projected/73f0ea77-b504-4e96-b463-f1b69dcefc6b-kube-api-access-ltd4z\") pod \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.289871 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-nb\") pod \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.289997 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-dns-svc\") pod \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.290061 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-config\") pod \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " Mar 20 16:21:59 crc kubenswrapper[4708]: 
I0320 16:21:59.290176 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-sb\") pod \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\" (UID: \"73f0ea77-b504-4e96-b463-f1b69dcefc6b\") " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.291694 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-zk9cx" podStartSLOduration=3.141981646 podStartE2EDuration="45.291659866s" podCreationTimestamp="2026-03-20 16:21:14 +0000 UTC" firstStartedPulling="2026-03-20 16:21:15.896330675 +0000 UTC m=+1230.570667390" lastFinishedPulling="2026-03-20 16:21:58.046008895 +0000 UTC m=+1272.720345610" observedRunningTime="2026-03-20 16:21:59.222681228 +0000 UTC m=+1273.897017943" watchObservedRunningTime="2026-03-20 16:21:59.291659866 +0000 UTC m=+1273.965996571" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.311392 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f0ea77-b504-4e96-b463-f1b69dcefc6b-kube-api-access-ltd4z" (OuterVolumeSpecName: "kube-api-access-ltd4z") pod "73f0ea77-b504-4e96-b463-f1b69dcefc6b" (UID: "73f0ea77-b504-4e96-b463-f1b69dcefc6b"). InnerVolumeSpecName "kube-api-access-ltd4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.322189 4708 generic.go:334] "Generic (PLEG): container finished" podID="ff22ac59-4905-4ddf-975d-2047f67dc4be" containerID="69b660f4b2464d8c1455c298897ba44ea605e23834cfcc93bc1f3165ee57abc6" exitCode=0 Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.322223 4708 generic.go:334] "Generic (PLEG): container finished" podID="ff22ac59-4905-4ddf-975d-2047f67dc4be" containerID="fe02ac34b8493c1e2b6bfff4ab21a92688407369014185039b3f016745caea03" exitCode=143 Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.322329 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff22ac59-4905-4ddf-975d-2047f67dc4be","Type":"ContainerDied","Data":"69b660f4b2464d8c1455c298897ba44ea605e23834cfcc93bc1f3165ee57abc6"} Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.322356 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff22ac59-4905-4ddf-975d-2047f67dc4be","Type":"ContainerDied","Data":"fe02ac34b8493c1e2b6bfff4ab21a92688407369014185039b3f016745caea03"} Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.389176 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-d6hdf"] Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.389655 4708 generic.go:334] "Generic (PLEG): container finished" podID="9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" containerID="034181e78dd49dbec62362055d85c31d6fdf4fd68c8f308d4580c8de5b3e1f9d" exitCode=0 Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.389749 4708 generic.go:334] "Generic (PLEG): container finished" podID="9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" containerID="13f8f5fe5a6eca2dbd0a42822bc6e8579ee26b3846c842eaa48eed01f89e9ec5" exitCode=143 Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.389837 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735","Type":"ContainerDied","Data":"034181e78dd49dbec62362055d85c31d6fdf4fd68c8f308d4580c8de5b3e1f9d"} Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.389965 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735","Type":"ContainerDied","Data":"13f8f5fe5a6eca2dbd0a42822bc6e8579ee26b3846c842eaa48eed01f89e9ec5"} Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.396317 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltd4z\" (UniqueName: \"kubernetes.io/projected/73f0ea77-b504-4e96-b463-f1b69dcefc6b-kube-api-access-ltd4z\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.396976 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd9698484-kk2kq" event={"ID":"c6ba411f-6368-44dd-a104-84c141bd9092","Type":"ContainerStarted","Data":"1d48299820249e03f84729559c54567de5611a289f5941d083471588ede269ed"} Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.411362 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73f0ea77-b504-4e96-b463-f1b69dcefc6b" (UID: "73f0ea77-b504-4e96-b463-f1b69dcefc6b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.461421 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-config" (OuterVolumeSpecName: "config") pod "73f0ea77-b504-4e96-b463-f1b69dcefc6b" (UID: "73f0ea77-b504-4e96-b463-f1b69dcefc6b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.507256 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.507538 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.515771 4708 scope.go:117] "RemoveContainer" containerID="c952ff4e08fd4025465d973eef1a8368a30c1a8187e9b0c465cb890ebc07041b" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.542109 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "73f0ea77-b504-4e96-b463-f1b69dcefc6b" (UID: "73f0ea77-b504-4e96-b463-f1b69dcefc6b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.618058 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.635990 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73f0ea77-b504-4e96-b463-f1b69dcefc6b" (UID: "73f0ea77-b504-4e96-b463-f1b69dcefc6b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.677763 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.720858 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73f0ea77-b504-4e96-b463-f1b69dcefc6b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.824284 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-httpd-run\") pod \"ff22ac59-4905-4ddf-975d-2047f67dc4be\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.824735 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-logs\") pod \"ff22ac59-4905-4ddf-975d-2047f67dc4be\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.824833 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-combined-ca-bundle\") pod \"ff22ac59-4905-4ddf-975d-2047f67dc4be\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.824893 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bglhl\" (UniqueName: \"kubernetes.io/projected/ff22ac59-4905-4ddf-975d-2047f67dc4be-kube-api-access-bglhl\") pod \"ff22ac59-4905-4ddf-975d-2047f67dc4be\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.824966 4708 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-config-data\") pod \"ff22ac59-4905-4ddf-975d-2047f67dc4be\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.825006 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ff22ac59-4905-4ddf-975d-2047f67dc4be\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.825112 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-scripts\") pod \"ff22ac59-4905-4ddf-975d-2047f67dc4be\" (UID: \"ff22ac59-4905-4ddf-975d-2047f67dc4be\") " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.825481 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff22ac59-4905-4ddf-975d-2047f67dc4be" (UID: "ff22ac59-4905-4ddf-975d-2047f67dc4be"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.825795 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-logs" (OuterVolumeSpecName: "logs") pod "ff22ac59-4905-4ddf-975d-2047f67dc4be" (UID: "ff22ac59-4905-4ddf-975d-2047f67dc4be"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.846481 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff22ac59-4905-4ddf-975d-2047f67dc4be-kube-api-access-bglhl" (OuterVolumeSpecName: "kube-api-access-bglhl") pod "ff22ac59-4905-4ddf-975d-2047f67dc4be" (UID: "ff22ac59-4905-4ddf-975d-2047f67dc4be"). InnerVolumeSpecName "kube-api-access-bglhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.861437 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68db5b9d4d-q2n5k"] Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.880215 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-scripts" (OuterVolumeSpecName: "scripts") pod "ff22ac59-4905-4ddf-975d-2047f67dc4be" (UID: "ff22ac59-4905-4ddf-975d-2047f67dc4be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.880280 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "ff22ac59-4905-4ddf-975d-2047f67dc4be" (UID: "ff22ac59-4905-4ddf-975d-2047f67dc4be"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.927882 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bglhl\" (UniqueName: \"kubernetes.io/projected/ff22ac59-4905-4ddf-975d-2047f67dc4be-kube-api-access-bglhl\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.927933 4708 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.927944 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.927952 4708 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.927961 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff22ac59-4905-4ddf-975d-2047f67dc4be-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:21:59 crc kubenswrapper[4708]: I0320 16:21:59.996929 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-lnqzp"] Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.020993 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-lnqzp"] Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.163384 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f0ea77-b504-4e96-b463-f1b69dcefc6b" path="/var/lib/kubelet/pods/73f0ea77-b504-4e96-b463-f1b69dcefc6b/volumes" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.165737 4708 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.186301 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567062-wp76k"] Mar 20 16:22:00 crc kubenswrapper[4708]: E0320 16:22:00.186730 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f0ea77-b504-4e96-b463-f1b69dcefc6b" containerName="init" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.186745 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f0ea77-b504-4e96-b463-f1b69dcefc6b" containerName="init" Mar 20 16:22:00 crc kubenswrapper[4708]: E0320 16:22:00.186762 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff22ac59-4905-4ddf-975d-2047f67dc4be" containerName="glance-log" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.186768 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff22ac59-4905-4ddf-975d-2047f67dc4be" containerName="glance-log" Mar 20 16:22:00 crc kubenswrapper[4708]: E0320 16:22:00.186785 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f0ea77-b504-4e96-b463-f1b69dcefc6b" containerName="dnsmasq-dns" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.186792 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f0ea77-b504-4e96-b463-f1b69dcefc6b" containerName="dnsmasq-dns" Mar 20 16:22:00 crc kubenswrapper[4708]: E0320 16:22:00.186805 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff22ac59-4905-4ddf-975d-2047f67dc4be" containerName="glance-httpd" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.186812 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff22ac59-4905-4ddf-975d-2047f67dc4be" containerName="glance-httpd" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.199307 4708 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ff22ac59-4905-4ddf-975d-2047f67dc4be" containerName="glance-log" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.199361 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff22ac59-4905-4ddf-975d-2047f67dc4be" containerName="glance-httpd" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.199384 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f0ea77-b504-4e96-b463-f1b69dcefc6b" containerName="dnsmasq-dns" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.203823 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-config-data" (OuterVolumeSpecName: "config-data") pod "ff22ac59-4905-4ddf-975d-2047f67dc4be" (UID: "ff22ac59-4905-4ddf-975d-2047f67dc4be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.228456 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-wp76k" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.236115 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff22ac59-4905-4ddf-975d-2047f67dc4be" (UID: "ff22ac59-4905-4ddf-975d-2047f67dc4be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.236382 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.236415 4708 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.236428 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff22ac59-4905-4ddf-975d-2047f67dc4be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.240493 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.242951 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.243139 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.269474 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-wp76k"] Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.337955 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68x2t\" (UniqueName: \"kubernetes.io/projected/c9260427-7f69-4872-9c37-912e6d1cd594-kube-api-access-68x2t\") pod \"auto-csr-approver-29567062-wp76k\" (UID: \"c9260427-7f69-4872-9c37-912e6d1cd594\") " pod="openshift-infra/auto-csr-approver-29567062-wp76k" Mar 20 16:22:00 crc 
kubenswrapper[4708]: I0320 16:22:00.414836 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68db5b9d4d-q2n5k" event={"ID":"bf812fb4-4e10-42ae-bc52-bc08b8749d29","Type":"ContainerStarted","Data":"39c308e50ed77143f2ef7df37ff841a983f0e60ff2d80e4a626d1017f21b878f"} Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.432700 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgjlj" event={"ID":"c46a759f-98ab-495d-9cab-ba1f2fbbb112","Type":"ContainerStarted","Data":"b59ea8e7c252be39ac0d296ea25722705ba81ac9da0d2242f50c7bd36f05d703"} Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.435748 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff22ac59-4905-4ddf-975d-2047f67dc4be","Type":"ContainerDied","Data":"7a544d1fc6bf92656483462c57b17af491bc331140fbc212a5721218401205be"} Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.435897 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.435914 4708 scope.go:117] "RemoveContainer" containerID="69b660f4b2464d8c1455c298897ba44ea605e23834cfcc93bc1f3165ee57abc6" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.439173 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735","Type":"ContainerDied","Data":"e2549dffe577024ab4017ccb53a8a5c19a3a7bc6e4b6710f4756270787ff852c"} Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.439222 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2549dffe577024ab4017ccb53a8a5c19a3a7bc6e4b6710f4756270787ff852c" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.442829 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68x2t\" (UniqueName: \"kubernetes.io/projected/c9260427-7f69-4872-9c37-912e6d1cd594-kube-api-access-68x2t\") pod \"auto-csr-approver-29567062-wp76k\" (UID: \"c9260427-7f69-4872-9c37-912e6d1cd594\") " pod="openshift-infra/auto-csr-approver-29567062-wp76k" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.456168 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd9698484-kk2kq" event={"ID":"c6ba411f-6368-44dd-a104-84c141bd9092","Type":"ContainerStarted","Data":"13236466279ec92fb025a04e7fea0b4e49e7b81026d9375b297aa3b2d2849b17"} Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.456222 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.461601 4708 generic.go:334] "Generic (PLEG): container finished" podID="1c34b82d-b121-4aba-91fe-18da561344a1" containerID="c47904057bf833961b8a63edac32d3dfa53174176ade4c6ac3d0cd406f3349e5" exitCode=0 Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 
16:22:00.461708 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-d6hdf" event={"ID":"1c34b82d-b121-4aba-91fe-18da561344a1","Type":"ContainerDied","Data":"c47904057bf833961b8a63edac32d3dfa53174176ade4c6ac3d0cd406f3349e5"} Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.461735 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-d6hdf" event={"ID":"1c34b82d-b121-4aba-91fe-18da561344a1","Type":"ContainerStarted","Data":"7204ecd45cf3989a366fc7fb368b4587d20a1791dd5993cb69631a761694f049"} Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.471504 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68x2t\" (UniqueName: \"kubernetes.io/projected/c9260427-7f69-4872-9c37-912e6d1cd594-kube-api-access-68x2t\") pod \"auto-csr-approver-29567062-wp76k\" (UID: \"c9260427-7f69-4872-9c37-912e6d1cd594\") " pod="openshift-infra/auto-csr-approver-29567062-wp76k" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.476068 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fgjlj" podStartSLOduration=4.160492759 podStartE2EDuration="46.47604467s" podCreationTimestamp="2026-03-20 16:21:14 +0000 UTC" firstStartedPulling="2026-03-20 16:21:15.71426117 +0000 UTC m=+1230.388597885" lastFinishedPulling="2026-03-20 16:21:58.029813081 +0000 UTC m=+1272.704149796" observedRunningTime="2026-03-20 16:22:00.462710605 +0000 UTC m=+1275.137047320" watchObservedRunningTime="2026-03-20 16:22:00.47604467 +0000 UTC m=+1275.150381375" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.481020 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"97328b2a68998a9347ae5c5313eefca4d003ec8ad200f3843aed9854aa3fd3e3"} Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.481064 4708 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"6c6b68cd6e64dba6b96a896d3eb7e77c1facbbe999496d488a14e79f1384cdc1"} Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.481076 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"46cd38927e074179154b4d1927a04224e19b7149c4b8ef9f92886f5b3733c27a"} Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.530222 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7bd9698484-kk2kq" podStartSLOduration=3.5301963020000002 podStartE2EDuration="3.530196302s" podCreationTimestamp="2026-03-20 16:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:00.513331371 +0000 UTC m=+1275.187668086" watchObservedRunningTime="2026-03-20 16:22:00.530196302 +0000 UTC m=+1275.204533017" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.657809 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-wp76k" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.658283 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.662182 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.669786 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.675405 4708 scope.go:117] "RemoveContainer" containerID="fe02ac34b8493c1e2b6bfff4ab21a92688407369014185039b3f016745caea03" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.715970 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:00 crc kubenswrapper[4708]: E0320 16:22:00.716486 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" containerName="glance-httpd" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.716502 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" containerName="glance-httpd" Mar 20 16:22:00 crc kubenswrapper[4708]: E0320 16:22:00.716535 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" containerName="glance-log" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.716547 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" containerName="glance-log" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.716824 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" containerName="glance-httpd" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.716859 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" containerName="glance-log" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.718060 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.725552 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.725827 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.753255 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.754530 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c6xp\" (UniqueName: \"kubernetes.io/projected/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-kube-api-access-4c6xp\") pod \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.754587 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-logs\") pod \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.754618 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-scripts\") pod \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.754783 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-config-data\") pod \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " Mar 20 16:22:00 crc 
kubenswrapper[4708]: I0320 16:22:00.754804 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.754827 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-combined-ca-bundle\") pod \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.754863 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-httpd-run\") pod \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\" (UID: \"9d4f8abd-eb29-4556-ae80-3cfd4a3e6735\") " Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.756270 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-logs" (OuterVolumeSpecName: "logs") pod "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" (UID: "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.760519 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" (UID: "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.761116 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" (UID: "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.772226 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-kube-api-access-4c6xp" (OuterVolumeSpecName: "kube-api-access-4c6xp") pod "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" (UID: "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735"). InnerVolumeSpecName "kube-api-access-4c6xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.853190 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9f694fd9c-lggtm"] Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.864398 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.868040 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.868285 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.872753 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9f694fd9c-lggtm"] Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.881411 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.881475 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.881585 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.881619 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.881764 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-logs\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.881755 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-scripts" (OuterVolumeSpecName: "scripts") pod "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" (UID: "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.881968 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.882374 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.882456 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xxfst\" (UniqueName: \"kubernetes.io/projected/ac8c72be-4723-43be-8537-072adeb1e924-kube-api-access-xxfst\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.882633 4708 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.882648 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c6xp\" (UniqueName: \"kubernetes.io/projected/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-kube-api-access-4c6xp\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.882658 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.882687 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.882713 4708 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.900089 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" (UID: "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.948836 4708 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.985624 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-combined-ca-bundle\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.985696 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw4mp\" (UniqueName: \"kubernetes.io/projected/90f720ed-ded0-464e-9691-8f83c10700b0-kube-api-access-pw4mp\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.985739 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.985777 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-config\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.985803 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-httpd-config\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.985832 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxfst\" (UniqueName: \"kubernetes.io/projected/ac8c72be-4723-43be-8537-072adeb1e924-kube-api-access-xxfst\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.985863 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-public-tls-certs\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.985887 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-ovndb-tls-certs\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.985937 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.985967 4708 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.986016 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.986038 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.986071 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-logs\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.986111 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.986150 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-internal-tls-certs\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.986221 4708 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.986238 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.987151 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.989471 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.994837 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.995132 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-logs\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:00 crc kubenswrapper[4708]: I0320 16:22:00.995707 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.008785 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.015045 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxfst\" (UniqueName: \"kubernetes.io/projected/ac8c72be-4723-43be-8537-072adeb1e924-kube-api-access-xxfst\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.020157 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.024633 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.026219 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-config-data" (OuterVolumeSpecName: "config-data") pod "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" (UID: "9d4f8abd-eb29-4556-ae80-3cfd4a3e6735"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.063275 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.087870 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-internal-tls-certs\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.087923 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-combined-ca-bundle\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.088408 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw4mp\" (UniqueName: \"kubernetes.io/projected/90f720ed-ded0-464e-9691-8f83c10700b0-kube-api-access-pw4mp\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: 
I0320 16:22:01.088453 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-config\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.088716 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-httpd-config\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.088758 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-public-tls-certs\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.088782 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-ovndb-tls-certs\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.089041 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.095350 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-internal-tls-certs\") pod \"neutron-9f694fd9c-lggtm\" (UID: 
\"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.098851 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-httpd-config\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.099550 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-ovndb-tls-certs\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.099789 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-config\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.100283 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-public-tls-certs\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.100510 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f720ed-ded0-464e-9691-8f83c10700b0-combined-ca-bundle\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.111852 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw4mp\" (UniqueName: \"kubernetes.io/projected/90f720ed-ded0-464e-9691-8f83c10700b0-kube-api-access-pw4mp\") pod \"neutron-9f694fd9c-lggtm\" (UID: \"90f720ed-ded0-464e-9691-8f83c10700b0\") " pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.199147 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.382062 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-wp76k"] Mar 20 16:22:01 crc kubenswrapper[4708]: W0320 16:22:01.422370 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9260427_7f69_4872_9c37_912e6d1cd594.slice/crio-137fbeb8861af6ed5bf911952006a5fe66304f41f2a585d9dda2389b749c9e4c WatchSource:0}: Error finding container 137fbeb8861af6ed5bf911952006a5fe66304f41f2a585d9dda2389b749c9e4c: Status 404 returned error can't find the container with id 137fbeb8861af6ed5bf911952006a5fe66304f41f2a585d9dda2389b749c9e4c Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.492961 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-wp76k" event={"ID":"c9260427-7f69-4872-9c37-912e6d1cd594","Type":"ContainerStarted","Data":"137fbeb8861af6ed5bf911952006a5fe66304f41f2a585d9dda2389b749c9e4c"} Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.497398 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-d6hdf" event={"ID":"1c34b82d-b121-4aba-91fe-18da561344a1","Type":"ContainerStarted","Data":"c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928"} Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.497545 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.530285 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fb745b69-d6hdf" podStartSLOduration=4.53026698 podStartE2EDuration="4.53026698s" podCreationTimestamp="2026-03-20 16:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:01.522772555 +0000 UTC m=+1276.197109270" watchObservedRunningTime="2026-03-20 16:22:01.53026698 +0000 UTC m=+1276.204603695" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.539529 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b5b88259-a142-42a0-ab2c-bb0980ad9465","Type":"ContainerStarted","Data":"3c59989ef3a0503121915004a673384e8fbec2e187194e6528683c3c496b9628"} Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.544685 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68db5b9d4d-q2n5k" event={"ID":"bf812fb4-4e10-42ae-bc52-bc08b8749d29","Type":"ContainerStarted","Data":"d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad"} Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.545598 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.547274 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.584628 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.322925114 podStartE2EDuration="1m17.584610248s" podCreationTimestamp="2026-03-20 16:20:44 +0000 UTC" firstStartedPulling="2026-03-20 16:21:18.392383127 +0000 UTC m=+1233.066719842" lastFinishedPulling="2026-03-20 16:21:58.654068241 +0000 UTC m=+1273.328404976" observedRunningTime="2026-03-20 16:22:01.579098827 +0000 UTC m=+1276.253435552" watchObservedRunningTime="2026-03-20 16:22:01.584610248 +0000 UTC m=+1276.258946963" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.622958 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-68db5b9d4d-q2n5k" podStartSLOduration=3.622933367 podStartE2EDuration="3.622933367s" podCreationTimestamp="2026-03-20 16:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:01.606810116 +0000 UTC m=+1276.281146831" watchObservedRunningTime="2026-03-20 16:22:01.622933367 +0000 UTC m=+1276.297270082" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.666325 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.685612 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.692816 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.694867 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.700519 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.700924 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.722112 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.813880 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.814641 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.814696 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.814723 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.814743 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.814781 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.814973 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.815019 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qhfv\" (UniqueName: \"kubernetes.io/projected/0d2a3b0e-e329-43b5-937a-107b0bea8941-kube-api-access-7qhfv\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.869560 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.917201 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.917329 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.917350 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.917367 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.917387 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 
16:22:01.917401 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.917511 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.917539 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qhfv\" (UniqueName: \"kubernetes.io/projected/0d2a3b0e-e329-43b5-937a-107b0bea8941-kube-api-access-7qhfv\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.920149 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.920448 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.926344 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.933935 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.942430 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.948189 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.948412 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qhfv\" (UniqueName: \"kubernetes.io/projected/0d2a3b0e-e329-43b5-937a-107b0bea8941-kube-api-access-7qhfv\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.951091 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.963996 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-d6hdf"] Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.986999 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mp56n"] Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.988689 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:01 crc kubenswrapper[4708]: I0320 16:22:01.994113 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.021979 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.030008 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mp56n"] Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.083083 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.095562 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9f694fd9c-lggtm"] Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.125503 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-config\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.125546 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.125581 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb67j\" (UniqueName: \"kubernetes.io/projected/82f1136f-a931-4ae6-b313-d664f25fa111-kube-api-access-hb67j\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.125619 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.125658 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.125701 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.128001 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4f8abd-eb29-4556-ae80-3cfd4a3e6735" path="/var/lib/kubelet/pods/9d4f8abd-eb29-4556-ae80-3cfd4a3e6735/volumes" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.129066 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff22ac59-4905-4ddf-975d-2047f67dc4be" path="/var/lib/kubelet/pods/ff22ac59-4905-4ddf-975d-2047f67dc4be/volumes" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.230216 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-config\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.230654 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc 
kubenswrapper[4708]: I0320 16:22:02.230717 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb67j\" (UniqueName: \"kubernetes.io/projected/82f1136f-a931-4ae6-b313-d664f25fa111-kube-api-access-hb67j\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.230768 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.230812 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.230842 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.231431 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.231533 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-config\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.232023 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.232492 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.234989 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.368899 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb67j\" (UniqueName: \"kubernetes.io/projected/82f1136f-a931-4ae6-b313-d664f25fa111-kube-api-access-hb67j\") pod \"dnsmasq-dns-55f844cf75-mp56n\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.572822 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"ac8c72be-4723-43be-8537-072adeb1e924","Type":"ContainerStarted","Data":"075415dd382ea42123ae040feaa93419172162920624edf385aa9e09628d8118"} Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.577652 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9f694fd9c-lggtm" event={"ID":"90f720ed-ded0-464e-9691-8f83c10700b0","Type":"ContainerStarted","Data":"d1ea592c11796be21eeeee901d5d9ea5c9b35c0f14e6221f82927d26c4e54c0c"} Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.583518 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68db5b9d4d-q2n5k" event={"ID":"bf812fb4-4e10-42ae-bc52-bc08b8749d29","Type":"ContainerStarted","Data":"fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c"} Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.645651 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.775959 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6b6ff5cbbd-kjfxp" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.870306 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7bc9dd67b8-mz4lv" podUID="15901de5-ddbe-4c7b-8968-8c614619be4d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 20 16:22:02 crc kubenswrapper[4708]: I0320 16:22:02.916637 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:02 crc kubenswrapper[4708]: W0320 16:22:02.936310 4708 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d2a3b0e_e329_43b5_937a_107b0bea8941.slice/crio-7c30406650e08d2764e1990ec20d4d84d89333a4fa6b608b8064ab4c0d12493b WatchSource:0}: Error finding container 7c30406650e08d2764e1990ec20d4d84d89333a4fa6b608b8064ab4c0d12493b: Status 404 returned error can't find the container with id 7c30406650e08d2764e1990ec20d4d84d89333a4fa6b608b8064ab4c0d12493b Mar 20 16:22:03 crc kubenswrapper[4708]: I0320 16:22:03.381025 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mp56n"] Mar 20 16:22:03 crc kubenswrapper[4708]: I0320 16:22:03.632143 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" event={"ID":"82f1136f-a931-4ae6-b313-d664f25fa111","Type":"ContainerStarted","Data":"00e556064c2ed55fcc707c37cc6e9cb93ec7a87d9013cd6d05322a66e0142a6a"} Mar 20 16:22:03 crc kubenswrapper[4708]: I0320 16:22:03.634273 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d2a3b0e-e329-43b5-937a-107b0bea8941","Type":"ContainerStarted","Data":"7c30406650e08d2764e1990ec20d4d84d89333a4fa6b608b8064ab4c0d12493b"} Mar 20 16:22:03 crc kubenswrapper[4708]: I0320 16:22:03.637450 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac8c72be-4723-43be-8537-072adeb1e924","Type":"ContainerStarted","Data":"81ccf63c44b61beb53e56f9b9bbb79ac7b584bc9050d2b8ea6e95ac0d0995df8"} Mar 20 16:22:03 crc kubenswrapper[4708]: I0320 16:22:03.644166 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-wp76k" event={"ID":"c9260427-7f69-4872-9c37-912e6d1cd594","Type":"ContainerStarted","Data":"380c1ad3e5e7a07edf393f0538a366f39f35f8414874b9535bc2a3c535d831cf"} Mar 20 16:22:03 crc kubenswrapper[4708]: I0320 16:22:03.649557 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-9f694fd9c-lggtm" event={"ID":"90f720ed-ded0-464e-9691-8f83c10700b0","Type":"ContainerStarted","Data":"5a5c887f2ab8b15684a9de2f23d9ed3bfec4f1bcfd83646e67dcad3d111cb143"} Mar 20 16:22:03 crc kubenswrapper[4708]: I0320 16:22:03.649613 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9f694fd9c-lggtm" event={"ID":"90f720ed-ded0-464e-9691-8f83c10700b0","Type":"ContainerStarted","Data":"0834ba530958cb5a212d45683f848e23dc2aa545cf7fbecc6220f947dc76c97a"} Mar 20 16:22:03 crc kubenswrapper[4708]: I0320 16:22:03.649755 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:03 crc kubenswrapper[4708]: I0320 16:22:03.649884 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fb745b69-d6hdf" podUID="1c34b82d-b121-4aba-91fe-18da561344a1" containerName="dnsmasq-dns" containerID="cri-o://c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928" gracePeriod=10 Mar 20 16:22:03 crc kubenswrapper[4708]: I0320 16:22:03.664870 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567062-wp76k" podStartSLOduration=2.521358283 podStartE2EDuration="3.664853138s" podCreationTimestamp="2026-03-20 16:22:00 +0000 UTC" firstStartedPulling="2026-03-20 16:22:01.425091971 +0000 UTC m=+1276.099428686" lastFinishedPulling="2026-03-20 16:22:02.568586826 +0000 UTC m=+1277.242923541" observedRunningTime="2026-03-20 16:22:03.662876713 +0000 UTC m=+1278.337213428" watchObservedRunningTime="2026-03-20 16:22:03.664853138 +0000 UTC m=+1278.339189853" Mar 20 16:22:03 crc kubenswrapper[4708]: I0320 16:22:03.692341 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9f694fd9c-lggtm" podStartSLOduration=3.692320619 podStartE2EDuration="3.692320619s" podCreationTimestamp="2026-03-20 16:22:00 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:03.686983834 +0000 UTC m=+1278.361320569" watchObservedRunningTime="2026-03-20 16:22:03.692320619 +0000 UTC m=+1278.366657344" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.162833 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.177241 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-nb\") pod \"1c34b82d-b121-4aba-91fe-18da561344a1\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.178183 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-config\") pod \"1c34b82d-b121-4aba-91fe-18da561344a1\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.178224 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-sb\") pod \"1c34b82d-b121-4aba-91fe-18da561344a1\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.178301 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bltqv\" (UniqueName: \"kubernetes.io/projected/1c34b82d-b121-4aba-91fe-18da561344a1-kube-api-access-bltqv\") pod \"1c34b82d-b121-4aba-91fe-18da561344a1\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.178382 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-dns-svc\") pod \"1c34b82d-b121-4aba-91fe-18da561344a1\" (UID: \"1c34b82d-b121-4aba-91fe-18da561344a1\") " Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.188074 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c34b82d-b121-4aba-91fe-18da561344a1-kube-api-access-bltqv" (OuterVolumeSpecName: "kube-api-access-bltqv") pod "1c34b82d-b121-4aba-91fe-18da561344a1" (UID: "1c34b82d-b121-4aba-91fe-18da561344a1"). InnerVolumeSpecName "kube-api-access-bltqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.257820 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-config" (OuterVolumeSpecName: "config") pod "1c34b82d-b121-4aba-91fe-18da561344a1" (UID: "1c34b82d-b121-4aba-91fe-18da561344a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.261755 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1c34b82d-b121-4aba-91fe-18da561344a1" (UID: "1c34b82d-b121-4aba-91fe-18da561344a1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.267537 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1c34b82d-b121-4aba-91fe-18da561344a1" (UID: "1c34b82d-b121-4aba-91fe-18da561344a1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.282123 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.282164 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.282199 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.282213 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bltqv\" (UniqueName: \"kubernetes.io/projected/1c34b82d-b121-4aba-91fe-18da561344a1-kube-api-access-bltqv\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.287316 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c34b82d-b121-4aba-91fe-18da561344a1" (UID: "1c34b82d-b121-4aba-91fe-18da561344a1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.385008 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c34b82d-b121-4aba-91fe-18da561344a1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.686154 4708 generic.go:334] "Generic (PLEG): container finished" podID="82f1136f-a931-4ae6-b313-d664f25fa111" containerID="cc9dcc857a0d2b0a5f16640ffd813552a6aded4a4e14d75f3e39718d6ddb7d19" exitCode=0 Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.686400 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" event={"ID":"82f1136f-a931-4ae6-b313-d664f25fa111","Type":"ContainerDied","Data":"cc9dcc857a0d2b0a5f16640ffd813552a6aded4a4e14d75f3e39718d6ddb7d19"} Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.691104 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d2a3b0e-e329-43b5-937a-107b0bea8941","Type":"ContainerStarted","Data":"fe4ed75fcf01f154ccd411b7819fcfe28253b3198ad4de5b3b4d33a84578e814"} Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.699039 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac8c72be-4723-43be-8537-072adeb1e924","Type":"ContainerStarted","Data":"e2c3b178ed0e541d253f458a7688daacddfaec3e1b2fcd36a923a2e80a254119"} Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.706523 4708 generic.go:334] "Generic (PLEG): container finished" podID="c9260427-7f69-4872-9c37-912e6d1cd594" containerID="380c1ad3e5e7a07edf393f0538a366f39f35f8414874b9535bc2a3c535d831cf" exitCode=0 Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.706582 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-wp76k" 
event={"ID":"c9260427-7f69-4872-9c37-912e6d1cd594","Type":"ContainerDied","Data":"380c1ad3e5e7a07edf393f0538a366f39f35f8414874b9535bc2a3c535d831cf"} Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.708533 4708 generic.go:334] "Generic (PLEG): container finished" podID="1c34b82d-b121-4aba-91fe-18da561344a1" containerID="c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928" exitCode=0 Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.709201 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-d6hdf" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.709773 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-d6hdf" event={"ID":"1c34b82d-b121-4aba-91fe-18da561344a1","Type":"ContainerDied","Data":"c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928"} Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.710062 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-d6hdf" event={"ID":"1c34b82d-b121-4aba-91fe-18da561344a1","Type":"ContainerDied","Data":"7204ecd45cf3989a366fc7fb368b4587d20a1791dd5993cb69631a761694f049"} Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.710080 4708 scope.go:117] "RemoveContainer" containerID="c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.789534 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.789518367 podStartE2EDuration="4.789518367s" podCreationTimestamp="2026-03-20 16:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:04.787043379 +0000 UTC m=+1279.461380104" watchObservedRunningTime="2026-03-20 16:22:04.789518367 +0000 UTC m=+1279.463855082" Mar 20 16:22:04 crc 
kubenswrapper[4708]: I0320 16:22:04.867092 4708 scope.go:117] "RemoveContainer" containerID="c47904057bf833961b8a63edac32d3dfa53174176ade4c6ac3d0cd406f3349e5" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.873051 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-d6hdf"] Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.887036 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-d6hdf"] Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.957863 4708 scope.go:117] "RemoveContainer" containerID="c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928" Mar 20 16:22:04 crc kubenswrapper[4708]: E0320 16:22:04.961832 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928\": container with ID starting with c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928 not found: ID does not exist" containerID="c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.961894 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928"} err="failed to get container status \"c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928\": rpc error: code = NotFound desc = could not find container \"c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928\": container with ID starting with c4cc2b82c1e677d527ba57f80b22713ef65b856327d22d6b7cf1082e9fb67928 not found: ID does not exist" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.961928 4708 scope.go:117] "RemoveContainer" containerID="c47904057bf833961b8a63edac32d3dfa53174176ade4c6ac3d0cd406f3349e5" Mar 20 16:22:04 crc kubenswrapper[4708]: E0320 16:22:04.967849 4708 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c47904057bf833961b8a63edac32d3dfa53174176ade4c6ac3d0cd406f3349e5\": container with ID starting with c47904057bf833961b8a63edac32d3dfa53174176ade4c6ac3d0cd406f3349e5 not found: ID does not exist" containerID="c47904057bf833961b8a63edac32d3dfa53174176ade4c6ac3d0cd406f3349e5" Mar 20 16:22:04 crc kubenswrapper[4708]: I0320 16:22:04.967893 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c47904057bf833961b8a63edac32d3dfa53174176ade4c6ac3d0cd406f3349e5"} err="failed to get container status \"c47904057bf833961b8a63edac32d3dfa53174176ade4c6ac3d0cd406f3349e5\": rpc error: code = NotFound desc = could not find container \"c47904057bf833961b8a63edac32d3dfa53174176ade4c6ac3d0cd406f3349e5\": container with ID starting with c47904057bf833961b8a63edac32d3dfa53174176ade4c6ac3d0cd406f3349e5 not found: ID does not exist" Mar 20 16:22:05 crc kubenswrapper[4708]: I0320 16:22:05.722232 4708 generic.go:334] "Generic (PLEG): container finished" podID="204070bf-f103-49d9-b366-185454e68b9e" containerID="89897c70acf5379db0001e6266077a0415262c310b34144e5e0a9c90860b2dd5" exitCode=0 Mar 20 16:22:05 crc kubenswrapper[4708]: I0320 16:22:05.722348 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zk9cx" event={"ID":"204070bf-f103-49d9-b366-185454e68b9e","Type":"ContainerDied","Data":"89897c70acf5379db0001e6266077a0415262c310b34144e5e0a9c90860b2dd5"} Mar 20 16:22:05 crc kubenswrapper[4708]: I0320 16:22:05.726298 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" event={"ID":"82f1136f-a931-4ae6-b313-d664f25fa111","Type":"ContainerStarted","Data":"48e5cef8f7b01e7879b6c288aa56c0804c1c3dfd8377784fcd8c242042ea3282"} Mar 20 16:22:05 crc kubenswrapper[4708]: I0320 16:22:05.726418 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:05 crc kubenswrapper[4708]: I0320 16:22:05.731863 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d2a3b0e-e329-43b5-937a-107b0bea8941","Type":"ContainerStarted","Data":"a966cb1ff98f8190c957d01cf1bd095dd553069e7a18c2b3287c6bb6a471aaaf"} Mar 20 16:22:05 crc kubenswrapper[4708]: I0320 16:22:05.773660 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.773635768 podStartE2EDuration="4.773635768s" podCreationTimestamp="2026-03-20 16:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:05.764389575 +0000 UTC m=+1280.438726290" watchObservedRunningTime="2026-03-20 16:22:05.773635768 +0000 UTC m=+1280.447972503" Mar 20 16:22:05 crc kubenswrapper[4708]: I0320 16:22:05.795594 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" podStartSLOduration=4.795571079 podStartE2EDuration="4.795571079s" podCreationTimestamp="2026-03-20 16:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:05.786415158 +0000 UTC m=+1280.460751873" watchObservedRunningTime="2026-03-20 16:22:05.795571079 +0000 UTC m=+1280.469907794" Mar 20 16:22:06 crc kubenswrapper[4708]: I0320 16:22:06.132386 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c34b82d-b121-4aba-91fe-18da561344a1" path="/var/lib/kubelet/pods/1c34b82d-b121-4aba-91fe-18da561344a1/volumes" Mar 20 16:22:08 crc kubenswrapper[4708]: I0320 16:22:08.776706 4708 generic.go:334] "Generic (PLEG): container finished" podID="c46a759f-98ab-495d-9cab-ba1f2fbbb112" containerID="b59ea8e7c252be39ac0d296ea25722705ba81ac9da0d2242f50c7bd36f05d703" 
exitCode=0 Mar 20 16:22:08 crc kubenswrapper[4708]: I0320 16:22:08.776776 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgjlj" event={"ID":"c46a759f-98ab-495d-9cab-ba1f2fbbb112","Type":"ContainerDied","Data":"b59ea8e7c252be39ac0d296ea25722705ba81ac9da0d2242f50c7bd36f05d703"} Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.139894 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.159508 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-wp76k" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.207564 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-combined-ca-bundle\") pod \"204070bf-f103-49d9-b366-185454e68b9e\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.207768 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brxlv\" (UniqueName: \"kubernetes.io/projected/204070bf-f103-49d9-b366-185454e68b9e-kube-api-access-brxlv\") pod \"204070bf-f103-49d9-b366-185454e68b9e\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.207894 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-db-sync-config-data\") pod \"204070bf-f103-49d9-b366-185454e68b9e\" (UID: \"204070bf-f103-49d9-b366-185454e68b9e\") " Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.225865 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/204070bf-f103-49d9-b366-185454e68b9e-kube-api-access-brxlv" (OuterVolumeSpecName: "kube-api-access-brxlv") pod "204070bf-f103-49d9-b366-185454e68b9e" (UID: "204070bf-f103-49d9-b366-185454e68b9e"). InnerVolumeSpecName "kube-api-access-brxlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.226093 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "204070bf-f103-49d9-b366-185454e68b9e" (UID: "204070bf-f103-49d9-b366-185454e68b9e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.247227 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "204070bf-f103-49d9-b366-185454e68b9e" (UID: "204070bf-f103-49d9-b366-185454e68b9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.310354 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68x2t\" (UniqueName: \"kubernetes.io/projected/c9260427-7f69-4872-9c37-912e6d1cd594-kube-api-access-68x2t\") pod \"c9260427-7f69-4872-9c37-912e6d1cd594\" (UID: \"c9260427-7f69-4872-9c37-912e6d1cd594\") " Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.310800 4708 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.310818 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204070bf-f103-49d9-b366-185454e68b9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.310828 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brxlv\" (UniqueName: \"kubernetes.io/projected/204070bf-f103-49d9-b366-185454e68b9e-kube-api-access-brxlv\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.314156 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9260427-7f69-4872-9c37-912e6d1cd594-kube-api-access-68x2t" (OuterVolumeSpecName: "kube-api-access-68x2t") pod "c9260427-7f69-4872-9c37-912e6d1cd594" (UID: "c9260427-7f69-4872-9c37-912e6d1cd594"). InnerVolumeSpecName "kube-api-access-68x2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.412364 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68x2t\" (UniqueName: \"kubernetes.io/projected/c9260427-7f69-4872-9c37-912e6d1cd594-kube-api-access-68x2t\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.789331 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-wp76k" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.789315 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-wp76k" event={"ID":"c9260427-7f69-4872-9c37-912e6d1cd594","Type":"ContainerDied","Data":"137fbeb8861af6ed5bf911952006a5fe66304f41f2a585d9dda2389b749c9e4c"} Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.789446 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="137fbeb8861af6ed5bf911952006a5fe66304f41f2a585d9dda2389b749c9e4c" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.795973 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-zk9cx" Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.798642 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zk9cx" event={"ID":"204070bf-f103-49d9-b366-185454e68b9e","Type":"ContainerDied","Data":"2bc5dab68aa1bb97b34e64d6a0cbbdae5d66c01c39fdc16d14813532c15e43f3"} Mar 20 16:22:09 crc kubenswrapper[4708]: I0320 16:22:09.798720 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bc5dab68aa1bb97b34e64d6a0cbbdae5d66c01c39fdc16d14813532c15e43f3" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.237198 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-rbg8m"] Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.245456 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-rbg8m"] Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.414639 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg"] Mar 20 16:22:10 crc kubenswrapper[4708]: E0320 16:22:10.415085 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204070bf-f103-49d9-b366-185454e68b9e" containerName="barbican-db-sync" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.415102 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="204070bf-f103-49d9-b366-185454e68b9e" containerName="barbican-db-sync" Mar 20 16:22:10 crc kubenswrapper[4708]: E0320 16:22:10.415115 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c34b82d-b121-4aba-91fe-18da561344a1" containerName="init" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.415122 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c34b82d-b121-4aba-91fe-18da561344a1" containerName="init" Mar 20 16:22:10 crc kubenswrapper[4708]: E0320 16:22:10.415137 4708 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c9260427-7f69-4872-9c37-912e6d1cd594" containerName="oc" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.415143 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9260427-7f69-4872-9c37-912e6d1cd594" containerName="oc" Mar 20 16:22:10 crc kubenswrapper[4708]: E0320 16:22:10.415158 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c34b82d-b121-4aba-91fe-18da561344a1" containerName="dnsmasq-dns" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.415164 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c34b82d-b121-4aba-91fe-18da561344a1" containerName="dnsmasq-dns" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.415360 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="204070bf-f103-49d9-b366-185454e68b9e" containerName="barbican-db-sync" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.415386 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9260427-7f69-4872-9c37-912e6d1cd594" containerName="oc" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.415397 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c34b82d-b121-4aba-91fe-18da561344a1" containerName="dnsmasq-dns" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.416359 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.419856 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.420152 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-g9d6t" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.424534 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.448799 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7cf76bb97c-b4rrf"] Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.461078 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.470567 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg"] Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.470869 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.536354 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8868c62f-1325-4541-96b6-57a48f5b045e-config-data-custom\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.536427 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-config-data\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.536474 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cvg8\" (UniqueName: \"kubernetes.io/projected/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-kube-api-access-5cvg8\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.536537 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-config-data-custom\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.536612 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-combined-ca-bundle\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.536647 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8868c62f-1325-4541-96b6-57a48f5b045e-config-data\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.536717 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8868c62f-1325-4541-96b6-57a48f5b045e-logs\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.536775 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-logs\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.551765 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cf76bb97c-b4rrf"] Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.551920 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8868c62f-1325-4541-96b6-57a48f5b045e-combined-ca-bundle\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.552100 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bftr4\" (UniqueName: \"kubernetes.io/projected/8868c62f-1325-4541-96b6-57a48f5b045e-kube-api-access-bftr4\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.635446 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mp56n"] Mar 20 16:22:10 crc 
kubenswrapper[4708]: I0320 16:22:10.635750 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" podUID="82f1136f-a931-4ae6-b313-d664f25fa111" containerName="dnsmasq-dns" containerID="cri-o://48e5cef8f7b01e7879b6c288aa56c0804c1c3dfd8377784fcd8c242042ea3282" gracePeriod=10 Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.638440 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.653875 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-logs\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.653946 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8868c62f-1325-4541-96b6-57a48f5b045e-combined-ca-bundle\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.654013 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bftr4\" (UniqueName: \"kubernetes.io/projected/8868c62f-1325-4541-96b6-57a48f5b045e-kube-api-access-bftr4\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.654075 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8868c62f-1325-4541-96b6-57a48f5b045e-config-data-custom\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.654107 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-config-data\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.654134 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cvg8\" (UniqueName: \"kubernetes.io/projected/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-kube-api-access-5cvg8\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.654187 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-config-data-custom\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.654250 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-combined-ca-bundle\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.654274 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8868c62f-1325-4541-96b6-57a48f5b045e-config-data\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.654294 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8868c62f-1325-4541-96b6-57a48f5b045e-logs\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.654328 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-logs\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.665525 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8868c62f-1325-4541-96b6-57a48f5b045e-logs\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.666290 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-combined-ca-bundle\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.670826 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-config-data-custom\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.670911 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-89cpk"] Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.680528 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.680859 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8868c62f-1325-4541-96b6-57a48f5b045e-combined-ca-bundle\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.684407 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8868c62f-1325-4541-96b6-57a48f5b045e-config-data-custom\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.693195 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cvg8\" (UniqueName: \"kubernetes.io/projected/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-kube-api-access-5cvg8\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.702999 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9-config-data\") pod \"barbican-worker-7cf76bb97c-b4rrf\" (UID: \"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9\") " pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.708308 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8868c62f-1325-4541-96b6-57a48f5b045e-config-data\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.714858 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-89cpk"] Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.716559 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bftr4\" (UniqueName: \"kubernetes.io/projected/8868c62f-1325-4541-96b6-57a48f5b045e-kube-api-access-bftr4\") pod \"barbican-keystone-listener-5f5dc8d9b6-dv4pg\" (UID: \"8868c62f-1325-4541-96b6-57a48f5b045e\") " pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.731276 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6cdb79bd7b-v5pg5"] Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.743197 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.745868 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.747435 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.757303 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-svc\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.757384 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-config\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.757479 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.757501 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.757534 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-swift-storage-0\") pod 
\"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.757566 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xbbz\" (UniqueName: \"kubernetes.io/projected/a8c78399-bd25-45db-8b4e-57dcda1870c3-kube-api-access-8xbbz\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.805544 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cf76bb97c-b4rrf" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.812457 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cdb79bd7b-v5pg5"] Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.860549 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.860602 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-combined-ca-bundle\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.860652 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-svc\") pod \"dnsmasq-dns-85ff748b95-89cpk\" 
(UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.860743 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-config\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.860821 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7ab775f-a2d7-43de-9f9d-356a6a86f930-logs\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.860849 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.860875 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.860919 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " 
pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.860956 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xbbz\" (UniqueName: \"kubernetes.io/projected/a8c78399-bd25-45db-8b4e-57dcda1870c3-kube-api-access-8xbbz\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.860992 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data-custom\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.861017 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq58f\" (UniqueName: \"kubernetes.io/projected/f7ab775f-a2d7-43de-9f9d-356a6a86f930-kube-api-access-mq58f\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.862080 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-svc\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.863164 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-config\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " 
pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.863838 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.872240 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.872632 4708 generic.go:334] "Generic (PLEG): container finished" podID="82f1136f-a931-4ae6-b313-d664f25fa111" containerID="48e5cef8f7b01e7879b6c288aa56c0804c1c3dfd8377784fcd8c242042ea3282" exitCode=0 Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.872732 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" event={"ID":"82f1136f-a931-4ae6-b313-d664f25fa111","Type":"ContainerDied","Data":"48e5cef8f7b01e7879b6c288aa56c0804c1c3dfd8377784fcd8c242042ea3282"} Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.872924 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.884794 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xbbz\" (UniqueName: 
\"kubernetes.io/projected/a8c78399-bd25-45db-8b4e-57dcda1870c3-kube-api-access-8xbbz\") pod \"dnsmasq-dns-85ff748b95-89cpk\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.886900 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgjlj" event={"ID":"c46a759f-98ab-495d-9cab-ba1f2fbbb112","Type":"ContainerDied","Data":"0aba4e1c947c8a0b7300103b6779e39a983f76d5367b8563b8d8e64b750837f3"} Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.886960 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aba4e1c947c8a0b7300103b6779e39a983f76d5367b8563b8d8e64b750837f3" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.948777 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.962946 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7ab775f-a2d7-43de-9f9d-356a6a86f930-logs\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.963063 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data-custom\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.963088 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq58f\" (UniqueName: \"kubernetes.io/projected/f7ab775f-a2d7-43de-9f9d-356a6a86f930-kube-api-access-mq58f\") pod \"barbican-api-6cdb79bd7b-v5pg5\" 
(UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.963125 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.963144 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-combined-ca-bundle\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.972755 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7ab775f-a2d7-43de-9f9d-356a6a86f930-logs\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.973636 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-combined-ca-bundle\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.981045 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data-custom\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " 
pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.981482 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.990596 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:10 crc kubenswrapper[4708]: I0320 16:22:10.997272 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq58f\" (UniqueName: \"kubernetes.io/projected/f7ab775f-a2d7-43de-9f9d-356a6a86f930-kube-api-access-mq58f\") pod \"barbican-api-6cdb79bd7b-v5pg5\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.006860 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.064058 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-combined-ca-bundle\") pod \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.064184 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzb9q\" (UniqueName: \"kubernetes.io/projected/c46a759f-98ab-495d-9cab-ba1f2fbbb112-kube-api-access-nzb9q\") pod \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.064254 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-scripts\") pod \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.064287 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-config-data\") pod \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.064322 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-db-sync-config-data\") pod \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.064435 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c46a759f-98ab-495d-9cab-ba1f2fbbb112-etc-machine-id\") pod \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\" (UID: \"c46a759f-98ab-495d-9cab-ba1f2fbbb112\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.066264 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c46a759f-98ab-495d-9cab-ba1f2fbbb112-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c46a759f-98ab-495d-9cab-ba1f2fbbb112" (UID: "c46a759f-98ab-495d-9cab-ba1f2fbbb112"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.070569 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.070614 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.091319 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-scripts" (OuterVolumeSpecName: "scripts") pod "c46a759f-98ab-495d-9cab-ba1f2fbbb112" (UID: "c46a759f-98ab-495d-9cab-ba1f2fbbb112"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.098180 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c46a759f-98ab-495d-9cab-ba1f2fbbb112" (UID: "c46a759f-98ab-495d-9cab-ba1f2fbbb112"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.101948 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c46a759f-98ab-495d-9cab-ba1f2fbbb112-kube-api-access-nzb9q" (OuterVolumeSpecName: "kube-api-access-nzb9q") pod "c46a759f-98ab-495d-9cab-ba1f2fbbb112" (UID: "c46a759f-98ab-495d-9cab-ba1f2fbbb112"). InnerVolumeSpecName "kube-api-access-nzb9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.132790 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c46a759f-98ab-495d-9cab-ba1f2fbbb112" (UID: "c46a759f-98ab-495d-9cab-ba1f2fbbb112"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.166319 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.167486 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.167523 4708 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.167540 4708 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c46a759f-98ab-495d-9cab-ba1f2fbbb112-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 
16:22:11.167554 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.167567 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzb9q\" (UniqueName: \"kubernetes.io/projected/c46a759f-98ab-495d-9cab-ba1f2fbbb112-kube-api-access-nzb9q\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.179422 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.181342 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-config-data" (OuterVolumeSpecName: "config-data") pod "c46a759f-98ab-495d-9cab-ba1f2fbbb112" (UID: "c46a759f-98ab-495d-9cab-ba1f2fbbb112"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.274809 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46a759f-98ab-495d-9cab-ba1f2fbbb112-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.291427 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:11 crc kubenswrapper[4708]: E0320 16:22:11.376705 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.376742 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-sb\") pod \"82f1136f-a931-4ae6-b313-d664f25fa111\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.376812 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-svc\") pod \"82f1136f-a931-4ae6-b313-d664f25fa111\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.376915 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-config\") pod \"82f1136f-a931-4ae6-b313-d664f25fa111\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.377003 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb67j\" (UniqueName: \"kubernetes.io/projected/82f1136f-a931-4ae6-b313-d664f25fa111-kube-api-access-hb67j\") pod \"82f1136f-a931-4ae6-b313-d664f25fa111\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.377021 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-swift-storage-0\") pod \"82f1136f-a931-4ae6-b313-d664f25fa111\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.377089 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-nb\") pod \"82f1136f-a931-4ae6-b313-d664f25fa111\" (UID: \"82f1136f-a931-4ae6-b313-d664f25fa111\") " Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.389282 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f1136f-a931-4ae6-b313-d664f25fa111-kube-api-access-hb67j" (OuterVolumeSpecName: "kube-api-access-hb67j") pod "82f1136f-a931-4ae6-b313-d664f25fa111" (UID: "82f1136f-a931-4ae6-b313-d664f25fa111"). InnerVolumeSpecName "kube-api-access-hb67j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.457787 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-config" (OuterVolumeSpecName: "config") pod "82f1136f-a931-4ae6-b313-d664f25fa111" (UID: "82f1136f-a931-4ae6-b313-d664f25fa111"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.469561 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82f1136f-a931-4ae6-b313-d664f25fa111" (UID: "82f1136f-a931-4ae6-b313-d664f25fa111"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.473890 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82f1136f-a931-4ae6-b313-d664f25fa111" (UID: "82f1136f-a931-4ae6-b313-d664f25fa111"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.478856 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.478894 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb67j\" (UniqueName: \"kubernetes.io/projected/82f1136f-a931-4ae6-b313-d664f25fa111-kube-api-access-hb67j\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.478907 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.478916 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.496739 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82f1136f-a931-4ae6-b313-d664f25fa111" (UID: "82f1136f-a931-4ae6-b313-d664f25fa111"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.502582 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "82f1136f-a931-4ae6-b313-d664f25fa111" (UID: "82f1136f-a931-4ae6-b313-d664f25fa111"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.581046 4708 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.581381 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82f1136f-a931-4ae6-b313-d664f25fa111-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.657314 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg"] Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.672196 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-89cpk"] Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.706189 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cf76bb97c-b4rrf"] Mar 20 16:22:11 crc kubenswrapper[4708]: W0320 16:22:11.711606 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1b82e86_4f8a_4d9c_8b0c_97b8b34cf8f9.slice/crio-e5c29f33e9d8c37e45a8fb873677db465c0d397e85f9e3ccbc9a1e7e60a1033d WatchSource:0}: Error finding container e5c29f33e9d8c37e45a8fb873677db465c0d397e85f9e3ccbc9a1e7e60a1033d: Status 404 
returned error can't find the container with id e5c29f33e9d8c37e45a8fb873677db465c0d397e85f9e3ccbc9a1e7e60a1033d Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.897131 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5121a54a-778a-4b46-9726-a4ba2901042b","Type":"ContainerStarted","Data":"e1336b5c2628254a61249c9e185998050e23e1ac1c5b8512af0c96acf39884ac"} Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.897223 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.897228 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="ceilometer-notification-agent" containerID="cri-o://9bfe310440d497465aa1c16e3fb8dae60cfad012896e7485f69d1e1aa9a863a6" gracePeriod=30 Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.897279 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="sg-core" containerID="cri-o://baa67816926af97c6686cfc2905d6b9f42787cad9e04af7895c47d3fa120f290" gracePeriod=30 Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.897273 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="proxy-httpd" containerID="cri-o://e1336b5c2628254a61249c9e185998050e23e1ac1c5b8512af0c96acf39884ac" gracePeriod=30 Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.899242 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-89cpk" event={"ID":"a8c78399-bd25-45db-8b4e-57dcda1870c3","Type":"ContainerStarted","Data":"5c88135fa89f3cd223a68a725966e394bb16b460e01f350415425648c8a76cbb"} Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.906489 4708 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" event={"ID":"82f1136f-a931-4ae6-b313-d664f25fa111","Type":"ContainerDied","Data":"00e556064c2ed55fcc707c37cc6e9cb93ec7a87d9013cd6d05322a66e0142a6a"} Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.906801 4708 scope.go:117] "RemoveContainer" containerID="48e5cef8f7b01e7879b6c288aa56c0804c1c3dfd8377784fcd8c242042ea3282" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.906799 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mp56n" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.909010 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" event={"ID":"8868c62f-1325-4541-96b6-57a48f5b045e","Type":"ContainerStarted","Data":"387870b32ddf337860533a148a93bf11ce2b5d21e3246b40bfefd0aa6887136c"} Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.913938 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fgjlj" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.917912 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cf76bb97c-b4rrf" event={"ID":"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9","Type":"ContainerStarted","Data":"e5c29f33e9d8c37e45a8fb873677db465c0d397e85f9e3ccbc9a1e7e60a1033d"} Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.919829 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.919869 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.936859 4708 scope.go:117] "RemoveContainer" containerID="cc9dcc857a0d2b0a5f16640ffd813552a6aded4a4e14d75f3e39718d6ddb7d19" Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.944318 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cdb79bd7b-v5pg5"] Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.972280 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mp56n"] Mar 20 16:22:11 crc kubenswrapper[4708]: I0320 16:22:11.994697 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mp56n"] Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.083981 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.084026 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.156786 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2014e205-8eb6-4dc9-8fa7-f0935f73019d" 
path="/var/lib/kubelet/pods/2014e205-8eb6-4dc9-8fa7-f0935f73019d/volumes" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.157537 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f1136f-a931-4ae6-b313-d664f25fa111" path="/var/lib/kubelet/pods/82f1136f-a931-4ae6-b313-d664f25fa111/volumes" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.173977 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.204081 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.312805 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:22:12 crc kubenswrapper[4708]: E0320 16:22:12.313590 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f1136f-a931-4ae6-b313-d664f25fa111" containerName="dnsmasq-dns" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.313607 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f1136f-a931-4ae6-b313-d664f25fa111" containerName="dnsmasq-dns" Mar 20 16:22:12 crc kubenswrapper[4708]: E0320 16:22:12.313642 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f1136f-a931-4ae6-b313-d664f25fa111" containerName="init" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.313648 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f1136f-a931-4ae6-b313-d664f25fa111" containerName="init" Mar 20 16:22:12 crc kubenswrapper[4708]: E0320 16:22:12.313658 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46a759f-98ab-495d-9cab-ba1f2fbbb112" containerName="cinder-db-sync" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.313684 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46a759f-98ab-495d-9cab-ba1f2fbbb112" containerName="cinder-db-sync" 
Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.313859 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f1136f-a931-4ae6-b313-d664f25fa111" containerName="dnsmasq-dns" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.313870 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46a759f-98ab-495d-9cab-ba1f2fbbb112" containerName="cinder-db-sync" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.315186 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.322232 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kmxkf" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.322543 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.322714 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.332891 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.341474 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.386971 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-89cpk"] Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.409813 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: 
I0320 16:22:12.409884 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.409988 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.410028 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h25hc\" (UniqueName: \"kubernetes.io/projected/9a13968c-5e9f-4c12-8105-8b5258e17cfc-kube-api-access-h25hc\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.410096 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-scripts\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.410129 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a13968c-5e9f-4c12-8105-8b5258e17cfc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.417383 4708 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5dlpz"] Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.420517 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.463727 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5dlpz"] Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512002 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512062 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512138 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512185 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 
crc kubenswrapper[4708]: I0320 16:22:12.512324 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-config\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512403 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512445 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512470 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h25hc\" (UniqueName: \"kubernetes.io/projected/9a13968c-5e9f-4c12-8105-8b5258e17cfc-kube-api-access-h25hc\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512502 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512553 4708 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs89r\" (UniqueName: \"kubernetes.io/projected/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-kube-api-access-vs89r\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512576 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-scripts\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512637 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a13968c-5e9f-4c12-8105-8b5258e17cfc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.512833 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a13968c-5e9f-4c12-8105-8b5258e17cfc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.519567 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.520319 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.530308 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.530851 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-scripts\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.541240 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h25hc\" (UniqueName: \"kubernetes.io/projected/9a13968c-5e9f-4c12-8105-8b5258e17cfc-kube-api-access-h25hc\") pod \"cinder-scheduler-0\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.577611 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.579429 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.584313 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.607951 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.619805 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-scripts\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.619860 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.619906 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.619940 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-config\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.619982 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.620006 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data-custom\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.620059 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.620091 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.620123 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.620160 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkptn\" (UniqueName: 
\"kubernetes.io/projected/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-kube-api-access-pkptn\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.620203 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs89r\" (UniqueName: \"kubernetes.io/projected/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-kube-api-access-vs89r\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.620226 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-logs\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.620265 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.621196 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.621825 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.622340 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-config\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.627074 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.627643 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.647580 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs89r\" (UniqueName: \"kubernetes.io/projected/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-kube-api-access-vs89r\") pod \"dnsmasq-dns-5c9776ccc5-5dlpz\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.686036 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.729182 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkptn\" (UniqueName: \"kubernetes.io/projected/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-kube-api-access-pkptn\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.729297 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-logs\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.729385 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-scripts\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.729402 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.729545 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.729571 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data-custom\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.729656 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.736233 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-logs\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.736302 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.737207 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-scripts\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.742503 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.743011 4708 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.750534 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.752614 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkptn\" (UniqueName: \"kubernetes.io/projected/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-kube-api-access-pkptn\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.756373 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data-custom\") pod \"cinder-api-0\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.776555 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6b6ff5cbbd-kjfxp" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.869835 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7bc9dd67b8-mz4lv" podUID="15901de5-ddbe-4c7b-8968-8c614619be4d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.929519 4708 
generic.go:334] "Generic (PLEG): container finished" podID="a8c78399-bd25-45db-8b4e-57dcda1870c3" containerID="2ccf93a96ceb500dd146b551ba64468d2a813a00bcd7d62bbaa65b0d3643caab" exitCode=0 Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.929572 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-89cpk" event={"ID":"a8c78399-bd25-45db-8b4e-57dcda1870c3","Type":"ContainerDied","Data":"2ccf93a96ceb500dd146b551ba64468d2a813a00bcd7d62bbaa65b0d3643caab"} Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.933525 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" event={"ID":"f7ab775f-a2d7-43de-9f9d-356a6a86f930","Type":"ContainerStarted","Data":"887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79"} Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.933584 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" event={"ID":"f7ab775f-a2d7-43de-9f9d-356a6a86f930","Type":"ContainerStarted","Data":"a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63"} Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.933597 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" event={"ID":"f7ab775f-a2d7-43de-9f9d-356a6a86f930","Type":"ContainerStarted","Data":"07ef991ff2fd7f224c7da06a02ff1f7c5c8fa07d5c1afa20bcd457198707488d"} Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.934446 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.934477 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.936166 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.943416 4708 generic.go:334] "Generic (PLEG): container finished" podID="5121a54a-778a-4b46-9726-a4ba2901042b" containerID="e1336b5c2628254a61249c9e185998050e23e1ac1c5b8512af0c96acf39884ac" exitCode=0 Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.943456 4708 generic.go:334] "Generic (PLEG): container finished" podID="5121a54a-778a-4b46-9726-a4ba2901042b" containerID="baa67816926af97c6686cfc2905d6b9f42787cad9e04af7895c47d3fa120f290" exitCode=2 Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.943468 4708 generic.go:334] "Generic (PLEG): container finished" podID="5121a54a-778a-4b46-9726-a4ba2901042b" containerID="9bfe310440d497465aa1c16e3fb8dae60cfad012896e7485f69d1e1aa9a863a6" exitCode=0 Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.943734 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5121a54a-778a-4b46-9726-a4ba2901042b","Type":"ContainerDied","Data":"e1336b5c2628254a61249c9e185998050e23e1ac1c5b8512af0c96acf39884ac"} Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.943796 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5121a54a-778a-4b46-9726-a4ba2901042b","Type":"ContainerDied","Data":"baa67816926af97c6686cfc2905d6b9f42787cad9e04af7895c47d3fa120f290"} Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.943813 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5121a54a-778a-4b46-9726-a4ba2901042b","Type":"ContainerDied","Data":"9bfe310440d497465aa1c16e3fb8dae60cfad012896e7485f69d1e1aa9a863a6"} Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.944443 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.944469 4708 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:12 crc kubenswrapper[4708]: I0320 16:22:12.971856 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" podStartSLOduration=2.971841468 podStartE2EDuration="2.971841468s" podCreationTimestamp="2026-03-20 16:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:12.970569224 +0000 UTC m=+1287.644905959" watchObservedRunningTime="2026-03-20 16:22:12.971841468 +0000 UTC m=+1287.646178173" Mar 20 16:22:13 crc kubenswrapper[4708]: E0320 16:22:13.353751 4708 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 20 16:22:13 crc kubenswrapper[4708]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/a8c78399-bd25-45db-8b4e-57dcda1870c3/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 16:22:13 crc kubenswrapper[4708]: > podSandboxID="5c88135fa89f3cd223a68a725966e394bb16b460e01f350415425648c8a76cbb" Mar 20 16:22:13 crc kubenswrapper[4708]: E0320 16:22:13.354461 4708 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 16:22:13 crc kubenswrapper[4708]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7ch57ch5c5hcch589hf7h577h659h96h5c8h5b4h55fhbbh667h565h5bchcbh58dh7dh5bch586h56ch574h598h67dh5c8h56dh8bh574h564hbch7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8xbbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-85ff748b95-89cpk_openstack(a8c78399-bd25-45db-8b4e-57dcda1870c3): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/a8c78399-bd25-45db-8b4e-57dcda1870c3/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 16:22:13 crc kubenswrapper[4708]: > logger="UnhandledError" Mar 20 16:22:13 crc kubenswrapper[4708]: E0320 16:22:13.358355 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/a8c78399-bd25-45db-8b4e-57dcda1870c3/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-85ff748b95-89cpk" podUID="a8c78399-bd25-45db-8b4e-57dcda1870c3" Mar 20 16:22:13 crc kubenswrapper[4708]: I0320 16:22:13.464491 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5dlpz"] Mar 20 16:22:13 crc kubenswrapper[4708]: I0320 16:22:13.799438 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:13 crc kubenswrapper[4708]: I0320 16:22:13.820970 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:22:13 crc kubenswrapper[4708]: I0320 16:22:13.913921 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:22:13 crc kubenswrapper[4708]: I0320 16:22:13.976854 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" event={"ID":"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd","Type":"ContainerStarted","Data":"75dd15ffbfb5cd14a49ac8ae81cd1881c76f359a48b317bfcacce62723be43a3"} Mar 20 16:22:13 crc kubenswrapper[4708]: I0320 16:22:13.992321 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9a13968c-5e9f-4c12-8105-8b5258e17cfc","Type":"ContainerStarted","Data":"aeb3198a8edc9a3058f416e428595d0a803d7f6c86d028d679877f79ac900e38"} Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.007353 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-sg-core-conf-yaml\") pod \"5121a54a-778a-4b46-9726-a4ba2901042b\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.007541 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-combined-ca-bundle\") pod \"5121a54a-778a-4b46-9726-a4ba2901042b\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.007564 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-config-data\") pod \"5121a54a-778a-4b46-9726-a4ba2901042b\" (UID: 
\"5121a54a-778a-4b46-9726-a4ba2901042b\") " Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.007610 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-log-httpd\") pod \"5121a54a-778a-4b46-9726-a4ba2901042b\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.007691 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-scripts\") pod \"5121a54a-778a-4b46-9726-a4ba2901042b\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.007727 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-run-httpd\") pod \"5121a54a-778a-4b46-9726-a4ba2901042b\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.007783 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr97h\" (UniqueName: \"kubernetes.io/projected/5121a54a-778a-4b46-9726-a4ba2901042b-kube-api-access-kr97h\") pod \"5121a54a-778a-4b46-9726-a4ba2901042b\" (UID: \"5121a54a-778a-4b46-9726-a4ba2901042b\") " Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.013002 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5121a54a-778a-4b46-9726-a4ba2901042b" (UID: "5121a54a-778a-4b46-9726-a4ba2901042b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.016379 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5121a54a-778a-4b46-9726-a4ba2901042b" (UID: "5121a54a-778a-4b46-9726-a4ba2901042b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.038860 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-scripts" (OuterVolumeSpecName: "scripts") pod "5121a54a-778a-4b46-9726-a4ba2901042b" (UID: "5121a54a-778a-4b46-9726-a4ba2901042b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.043009 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5121a54a-778a-4b46-9726-a4ba2901042b","Type":"ContainerDied","Data":"5eac20424b38411cf6b3824526f41b6e8d68e9b2ceb26101fb1c396149ee0572"} Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.043067 4708 scope.go:117] "RemoveContainer" containerID="e1336b5c2628254a61249c9e185998050e23e1ac1c5b8512af0c96acf39884ac" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.043212 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.046921 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5121a54a-778a-4b46-9726-a4ba2901042b-kube-api-access-kr97h" (OuterVolumeSpecName: "kube-api-access-kr97h") pod "5121a54a-778a-4b46-9726-a4ba2901042b" (UID: "5121a54a-778a-4b46-9726-a4ba2901042b"). InnerVolumeSpecName "kube-api-access-kr97h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.066261 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c","Type":"ContainerStarted","Data":"ec5ba7cc3a0fd6d57bd86b709d22c437a0687105d28654f971ff0bb2f86dc0f8"} Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.066361 4708 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.066370 4708 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.086980 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5121a54a-778a-4b46-9726-a4ba2901042b" (UID: "5121a54a-778a-4b46-9726-a4ba2901042b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.099018 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5121a54a-778a-4b46-9726-a4ba2901042b" (UID: "5121a54a-778a-4b46-9726-a4ba2901042b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.115416 4708 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.117003 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.117323 4708 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.117340 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.117350 4708 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5121a54a-778a-4b46-9726-a4ba2901042b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.117359 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr97h\" (UniqueName: \"kubernetes.io/projected/5121a54a-778a-4b46-9726-a4ba2901042b-kube-api-access-kr97h\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.136195 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-config-data" (OuterVolumeSpecName: "config-data") pod "5121a54a-778a-4b46-9726-a4ba2901042b" (UID: "5121a54a-778a-4b46-9726-a4ba2901042b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.141444 4708 scope.go:117] "RemoveContainer" containerID="baa67816926af97c6686cfc2905d6b9f42787cad9e04af7895c47d3fa120f290" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.197858 4708 scope.go:117] "RemoveContainer" containerID="9bfe310440d497465aa1c16e3fb8dae60cfad012896e7485f69d1e1aa9a863a6" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.220747 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5121a54a-778a-4b46-9726-a4ba2901042b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.413112 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.447582 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.462555 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:14 crc kubenswrapper[4708]: E0320 16:22:14.463021 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="sg-core" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.463040 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="sg-core" Mar 20 16:22:14 crc kubenswrapper[4708]: E0320 16:22:14.463076 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="ceilometer-notification-agent" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.463084 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="ceilometer-notification-agent" Mar 20 16:22:14 crc kubenswrapper[4708]: E0320 
16:22:14.463098 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="proxy-httpd" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.463107 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="proxy-httpd" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.463289 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="sg-core" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.463310 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="ceilometer-notification-agent" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.463319 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" containerName="proxy-httpd" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.465070 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.473429 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.473772 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.477588 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.627954 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-config-data\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.628013 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-log-httpd\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.628042 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-run-httpd\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.628071 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " 
pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.628149 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.628179 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-scripts\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.628241 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq66q\" (UniqueName: \"kubernetes.io/projected/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-kube-api-access-qq66q\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.730329 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.731138 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-scripts\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.731199 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-qq66q\" (UniqueName: \"kubernetes.io/projected/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-kube-api-access-qq66q\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.731299 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-config-data\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.731317 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-log-httpd\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.731339 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-run-httpd\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.731365 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.737077 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 
16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.737089 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-log-httpd\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.737113 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-run-httpd\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.737188 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-config-data\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.737701 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.745635 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-scripts\") pod \"ceilometer-0\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.752587 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq66q\" (UniqueName: \"kubernetes.io/projected/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-kube-api-access-qq66q\") pod \"ceilometer-0\" (UID: 
\"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.790657 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.880300 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 16:22:14 crc kubenswrapper[4708]: I0320 16:22:14.889086 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.080835 4708 generic.go:334] "Generic (PLEG): container finished" podID="484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" containerID="b555b5cfc5ceb4d5d5a1e7f305edaecc7869b07e6a854782790909894e85b7dc" exitCode=0 Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.080934 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" event={"ID":"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd","Type":"ContainerDied","Data":"b555b5cfc5ceb4d5d5a1e7f305edaecc7869b07e6a854782790909894e85b7dc"} Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.093343 4708 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.093382 4708 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.094014 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c","Type":"ContainerStarted","Data":"b944e61c5c16abc622e9a2e425e1b456cbf6c849ea7f1ff203f30fa8c1fcecd3"} Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.121834 4708 scope.go:117] "RemoveContainer" containerID="419a0b72c831d2af0889a6f5356970735fbd438b4513aef69a146e4d0e01dea4" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.494149 
4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.607740 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xbbz\" (UniqueName: \"kubernetes.io/projected/a8c78399-bd25-45db-8b4e-57dcda1870c3-kube-api-access-8xbbz\") pod \"a8c78399-bd25-45db-8b4e-57dcda1870c3\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.607888 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-sb\") pod \"a8c78399-bd25-45db-8b4e-57dcda1870c3\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.607938 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-swift-storage-0\") pod \"a8c78399-bd25-45db-8b4e-57dcda1870c3\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.607992 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-svc\") pod \"a8c78399-bd25-45db-8b4e-57dcda1870c3\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.608026 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-nb\") pod \"a8c78399-bd25-45db-8b4e-57dcda1870c3\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.608067 4708 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-config\") pod \"a8c78399-bd25-45db-8b4e-57dcda1870c3\" (UID: \"a8c78399-bd25-45db-8b4e-57dcda1870c3\") " Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.616536 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c78399-bd25-45db-8b4e-57dcda1870c3-kube-api-access-8xbbz" (OuterVolumeSpecName: "kube-api-access-8xbbz") pod "a8c78399-bd25-45db-8b4e-57dcda1870c3" (UID: "a8c78399-bd25-45db-8b4e-57dcda1870c3"). InnerVolumeSpecName "kube-api-access-8xbbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.661509 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8c78399-bd25-45db-8b4e-57dcda1870c3" (UID: "a8c78399-bd25-45db-8b4e-57dcda1870c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.667324 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-config" (OuterVolumeSpecName: "config") pod "a8c78399-bd25-45db-8b4e-57dcda1870c3" (UID: "a8c78399-bd25-45db-8b4e-57dcda1870c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.673429 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a8c78399-bd25-45db-8b4e-57dcda1870c3" (UID: "a8c78399-bd25-45db-8b4e-57dcda1870c3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.675040 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a8c78399-bd25-45db-8b4e-57dcda1870c3" (UID: "a8c78399-bd25-45db-8b4e-57dcda1870c3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.716400 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a8c78399-bd25-45db-8b4e-57dcda1870c3" (UID: "a8c78399-bd25-45db-8b4e-57dcda1870c3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.722314 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.722353 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xbbz\" (UniqueName: \"kubernetes.io/projected/a8c78399-bd25-45db-8b4e-57dcda1870c3-kube-api-access-8xbbz\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.722363 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.722553 4708 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.722567 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.722575 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8c78399-bd25-45db-8b4e-57dcda1870c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.876850 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.885573 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.909662 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:15 crc kubenswrapper[4708]: I0320 16:22:15.913143 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:16 crc kubenswrapper[4708]: I0320 16:22:16.137608 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5121a54a-778a-4b46-9726-a4ba2901042b" path="/var/lib/kubelet/pods/5121a54a-778a-4b46-9726-a4ba2901042b/volumes" Mar 20 16:22:16 crc kubenswrapper[4708]: I0320 16:22:16.138738 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3","Type":"ContainerStarted","Data":"3ac7bd427a72918b86e3d0a88283e17fe850b07e8d5f0a4b5dce4cb0850a458d"} Mar 20 16:22:16 crc kubenswrapper[4708]: I0320 16:22:16.140577 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-89cpk" Mar 20 16:22:16 crc kubenswrapper[4708]: I0320 16:22:16.142730 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-89cpk" event={"ID":"a8c78399-bd25-45db-8b4e-57dcda1870c3","Type":"ContainerDied","Data":"5c88135fa89f3cd223a68a725966e394bb16b460e01f350415425648c8a76cbb"} Mar 20 16:22:16 crc kubenswrapper[4708]: I0320 16:22:16.143074 4708 scope.go:117] "RemoveContainer" containerID="2ccf93a96ceb500dd146b551ba64468d2a813a00bcd7d62bbaa65b0d3643caab" Mar 20 16:22:16 crc kubenswrapper[4708]: I0320 16:22:16.326635 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-89cpk"] Mar 20 16:22:16 crc kubenswrapper[4708]: I0320 16:22:16.335201 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-89cpk"] Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.158976 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cf76bb97c-b4rrf" event={"ID":"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9","Type":"ContainerStarted","Data":"809d62fca736cd4c720d05734318247e73a24de537fce7c63cd75c2c074622e5"} Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.159258 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cf76bb97c-b4rrf" event={"ID":"f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9","Type":"ContainerStarted","Data":"5a8be8d9e75e2dabd94ca19ebea60289690ae31cec9cb433dfcc8d682629ab34"} Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.162983 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9a13968c-5e9f-4c12-8105-8b5258e17cfc","Type":"ContainerStarted","Data":"3c3ff13e6488407829f3792db642c16be32c33ccf6e1ce9696139f5df305bf7c"} Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.171228 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" 
event={"ID":"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd","Type":"ContainerStarted","Data":"a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316"} Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.171527 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.176852 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" event={"ID":"8868c62f-1325-4541-96b6-57a48f5b045e","Type":"ContainerStarted","Data":"fbf0f38f33a822678b7f4f68ad45c5e7d0c8efd867fc97cead1bdcdb6697d2a2"} Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.176924 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" event={"ID":"8868c62f-1325-4541-96b6-57a48f5b045e","Type":"ContainerStarted","Data":"aa67e02f1005b45d16e6ee751ad46349b07960acd90bd6117f86fef4e156cc3c"} Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.185106 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" containerName="cinder-api-log" containerID="cri-o://b944e61c5c16abc622e9a2e425e1b456cbf6c849ea7f1ff203f30fa8c1fcecd3" gracePeriod=30 Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.185213 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c","Type":"ContainerStarted","Data":"4ba8669b02f07a77d6f20179a54d39ad8e036c49843f75706ab61fd4849e1aec"} Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.185266 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.185306 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" 
podUID="00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" containerName="cinder-api" containerID="cri-o://4ba8669b02f07a77d6f20179a54d39ad8e036c49843f75706ab61fd4849e1aec" gracePeriod=30 Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.205652 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" podStartSLOduration=5.205635404 podStartE2EDuration="5.205635404s" podCreationTimestamp="2026-03-20 16:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:17.204237276 +0000 UTC m=+1291.878573991" watchObservedRunningTime="2026-03-20 16:22:17.205635404 +0000 UTC m=+1291.879972119" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.210136 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7cf76bb97c-b4rrf" podStartSLOduration=3.389938194 podStartE2EDuration="7.210114526s" podCreationTimestamp="2026-03-20 16:22:10 +0000 UTC" firstStartedPulling="2026-03-20 16:22:11.714282171 +0000 UTC m=+1286.388618886" lastFinishedPulling="2026-03-20 16:22:15.534458503 +0000 UTC m=+1290.208795218" observedRunningTime="2026-03-20 16:22:17.177987797 +0000 UTC m=+1291.852324522" watchObservedRunningTime="2026-03-20 16:22:17.210114526 +0000 UTC m=+1291.884451251" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.250727 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5f5dc8d9b6-dv4pg" podStartSLOduration=3.397252985 podStartE2EDuration="7.250704678s" podCreationTimestamp="2026-03-20 16:22:10 +0000 UTC" firstStartedPulling="2026-03-20 16:22:11.677577587 +0000 UTC m=+1286.351914302" lastFinishedPulling="2026-03-20 16:22:15.53102928 +0000 UTC m=+1290.205365995" observedRunningTime="2026-03-20 16:22:17.244993361 +0000 UTC m=+1291.919330076" watchObservedRunningTime="2026-03-20 16:22:17.250704678 +0000 UTC 
m=+1291.925041413" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.281134 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.2811176 podStartE2EDuration="5.2811176s" podCreationTimestamp="2026-03-20 16:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:17.263729894 +0000 UTC m=+1291.938066629" watchObservedRunningTime="2026-03-20 16:22:17.2811176 +0000 UTC m=+1291.955454315" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.605765 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c9d458d9b-7t7xq"] Mar 20 16:22:17 crc kubenswrapper[4708]: E0320 16:22:17.606328 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c78399-bd25-45db-8b4e-57dcda1870c3" containerName="init" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.606344 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c78399-bd25-45db-8b4e-57dcda1870c3" containerName="init" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.606697 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c78399-bd25-45db-8b4e-57dcda1870c3" containerName="init" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.608272 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.611304 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c9d458d9b-7t7xq"] Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.612066 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.612256 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.683893 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwlkn\" (UniqueName: \"kubernetes.io/projected/e3c707e2-14a0-493e-88f8-81760da73840-kube-api-access-zwlkn\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.684708 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-combined-ca-bundle\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.684955 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-config-data-custom\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.685060 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c707e2-14a0-493e-88f8-81760da73840-logs\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.685184 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-config-data\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.685461 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-public-tls-certs\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.685513 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-internal-tls-certs\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.789955 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwlkn\" (UniqueName: \"kubernetes.io/projected/e3c707e2-14a0-493e-88f8-81760da73840-kube-api-access-zwlkn\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.790319 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-combined-ca-bundle\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.791061 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-config-data-custom\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.791112 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c707e2-14a0-493e-88f8-81760da73840-logs\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.791169 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-config-data\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.791306 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-public-tls-certs\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.791327 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-internal-tls-certs\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.791728 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c707e2-14a0-493e-88f8-81760da73840-logs\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.796173 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-combined-ca-bundle\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.797490 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-internal-tls-certs\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.798312 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-config-data-custom\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.799032 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-public-tls-certs\") pod 
\"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.803288 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c707e2-14a0-493e-88f8-81760da73840-config-data\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.810972 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwlkn\" (UniqueName: \"kubernetes.io/projected/e3c707e2-14a0-493e-88f8-81760da73840-kube-api-access-zwlkn\") pod \"barbican-api-7c9d458d9b-7t7xq\" (UID: \"e3c707e2-14a0-493e-88f8-81760da73840\") " pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:17 crc kubenswrapper[4708]: I0320 16:22:17.944485 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:18 crc kubenswrapper[4708]: I0320 16:22:18.194715 4708 generic.go:334] "Generic (PLEG): container finished" podID="00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" containerID="b944e61c5c16abc622e9a2e425e1b456cbf6c849ea7f1ff203f30fa8c1fcecd3" exitCode=143 Mar 20 16:22:18 crc kubenswrapper[4708]: I0320 16:22:18.223224 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c78399-bd25-45db-8b4e-57dcda1870c3" path="/var/lib/kubelet/pods/a8c78399-bd25-45db-8b4e-57dcda1870c3/volumes" Mar 20 16:22:18 crc kubenswrapper[4708]: I0320 16:22:18.224170 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c","Type":"ContainerDied","Data":"b944e61c5c16abc622e9a2e425e1b456cbf6c849ea7f1ff203f30fa8c1fcecd3"} Mar 20 16:22:18 crc kubenswrapper[4708]: W0320 16:22:18.597748 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c707e2_14a0_493e_88f8_81760da73840.slice/crio-594318afc93aa94b45266637d22f67307412ca05f4e61cf3b19fdd5edd9dfae5 WatchSource:0}: Error finding container 594318afc93aa94b45266637d22f67307412ca05f4e61cf3b19fdd5edd9dfae5: Status 404 returned error can't find the container with id 594318afc93aa94b45266637d22f67307412ca05f4e61cf3b19fdd5edd9dfae5 Mar 20 16:22:18 crc kubenswrapper[4708]: I0320 16:22:18.598723 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c9d458d9b-7t7xq"] Mar 20 16:22:19 crc kubenswrapper[4708]: I0320 16:22:19.206749 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c9d458d9b-7t7xq" event={"ID":"e3c707e2-14a0-493e-88f8-81760da73840","Type":"ContainerStarted","Data":"93152c2663b810282288e78e8d5cba83de0f5baf739a605e01ae9b2fed303076"} Mar 20 16:22:19 crc kubenswrapper[4708]: I0320 16:22:19.206796 4708 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-api-7c9d458d9b-7t7xq" event={"ID":"e3c707e2-14a0-493e-88f8-81760da73840","Type":"ContainerStarted","Data":"023dbba28d8848dd5a18505a66a6454c81fb2049b68e201b31affeaa40d46401"} Mar 20 16:22:19 crc kubenswrapper[4708]: I0320 16:22:19.206809 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c9d458d9b-7t7xq" event={"ID":"e3c707e2-14a0-493e-88f8-81760da73840","Type":"ContainerStarted","Data":"594318afc93aa94b45266637d22f67307412ca05f4e61cf3b19fdd5edd9dfae5"} Mar 20 16:22:19 crc kubenswrapper[4708]: I0320 16:22:19.208480 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9a13968c-5e9f-4c12-8105-8b5258e17cfc","Type":"ContainerStarted","Data":"cf91833cf125fcd6f1722339530c383e9d3dc95ca2d18ab88bd2fe109ff77102"} Mar 20 16:22:19 crc kubenswrapper[4708]: I0320 16:22:19.212209 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3","Type":"ContainerStarted","Data":"59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875"} Mar 20 16:22:19 crc kubenswrapper[4708]: I0320 16:22:19.237459 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.15402106 podStartE2EDuration="7.237440007s" podCreationTimestamp="2026-03-20 16:22:12 +0000 UTC" firstStartedPulling="2026-03-20 16:22:13.900878342 +0000 UTC m=+1288.575215057" lastFinishedPulling="2026-03-20 16:22:15.984297289 +0000 UTC m=+1290.658634004" observedRunningTime="2026-03-20 16:22:19.228545644 +0000 UTC m=+1293.902882359" watchObservedRunningTime="2026-03-20 16:22:19.237440007 +0000 UTC m=+1293.911776722" Mar 20 16:22:20 crc kubenswrapper[4708]: I0320 16:22:20.227285 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3","Type":"ContainerStarted","Data":"e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51"} Mar 20 16:22:20 crc kubenswrapper[4708]: I0320 16:22:20.227784 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:20 crc kubenswrapper[4708]: I0320 16:22:20.227813 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:20 crc kubenswrapper[4708]: I0320 16:22:20.263360 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c9d458d9b-7t7xq" podStartSLOduration=3.263334393 podStartE2EDuration="3.263334393s" podCreationTimestamp="2026-03-20 16:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:20.249601657 +0000 UTC m=+1294.923938372" watchObservedRunningTime="2026-03-20 16:22:20.263334393 +0000 UTC m=+1294.937671098" Mar 20 16:22:21 crc kubenswrapper[4708]: I0320 16:22:21.243114 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3","Type":"ContainerStarted","Data":"bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd"} Mar 20 16:22:22 crc kubenswrapper[4708]: I0320 16:22:22.403227 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:22 crc kubenswrapper[4708]: I0320 16:22:22.475792 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:22 crc kubenswrapper[4708]: I0320 16:22:22.688064 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 16:22:22 crc kubenswrapper[4708]: I0320 16:22:22.745748 4708 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:22:22 crc kubenswrapper[4708]: I0320 16:22:22.824839 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"] Mar 20 16:22:22 crc kubenswrapper[4708]: I0320 16:22:22.825585 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" podUID="11974898-2dd8-4e18-9d89-64442e4dce69" containerName="dnsmasq-dns" containerID="cri-o://f84bd0d3b9c8131f026032beb565feac8cb56887349ac08bac72d55be8487821" gracePeriod=10 Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.127433 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.285826 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3","Type":"ContainerStarted","Data":"456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea"} Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.286714 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.300891 4708 generic.go:334] "Generic (PLEG): container finished" podID="11974898-2dd8-4e18-9d89-64442e4dce69" containerID="f84bd0d3b9c8131f026032beb565feac8cb56887349ac08bac72d55be8487821" exitCode=0 Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.303613 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" event={"ID":"11974898-2dd8-4e18-9d89-64442e4dce69","Type":"ContainerDied","Data":"f84bd0d3b9c8131f026032beb565feac8cb56887349ac08bac72d55be8487821"} Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.324006 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.726750536 podStartE2EDuration="9.323970042s" podCreationTimestamp="2026-03-20 16:22:14 +0000 UTC" firstStartedPulling="2026-03-20 16:22:15.984446943 +0000 UTC m=+1290.658783658" lastFinishedPulling="2026-03-20 16:22:22.581666459 +0000 UTC m=+1297.256003164" observedRunningTime="2026-03-20 16:22:23.31879403 +0000 UTC m=+1297.993130765" watchObservedRunningTime="2026-03-20 16:22:23.323970042 +0000 UTC m=+1297.998306757" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.402815 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.554102 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.642246 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-sb\") pod \"11974898-2dd8-4e18-9d89-64442e4dce69\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.642346 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-config\") pod \"11974898-2dd8-4e18-9d89-64442e4dce69\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.642419 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-dns-svc\") pod \"11974898-2dd8-4e18-9d89-64442e4dce69\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.642490 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8w9pj\" (UniqueName: \"kubernetes.io/projected/11974898-2dd8-4e18-9d89-64442e4dce69-kube-api-access-8w9pj\") pod \"11974898-2dd8-4e18-9d89-64442e4dce69\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.642602 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-nb\") pod \"11974898-2dd8-4e18-9d89-64442e4dce69\" (UID: \"11974898-2dd8-4e18-9d89-64442e4dce69\") " Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.661065 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11974898-2dd8-4e18-9d89-64442e4dce69-kube-api-access-8w9pj" (OuterVolumeSpecName: "kube-api-access-8w9pj") pod "11974898-2dd8-4e18-9d89-64442e4dce69" (UID: "11974898-2dd8-4e18-9d89-64442e4dce69"). InnerVolumeSpecName "kube-api-access-8w9pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.746550 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w9pj\" (UniqueName: \"kubernetes.io/projected/11974898-2dd8-4e18-9d89-64442e4dce69-kube-api-access-8w9pj\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.774782 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11974898-2dd8-4e18-9d89-64442e4dce69" (UID: "11974898-2dd8-4e18-9d89-64442e4dce69"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.782705 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11974898-2dd8-4e18-9d89-64442e4dce69" (UID: "11974898-2dd8-4e18-9d89-64442e4dce69"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.792465 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-config" (OuterVolumeSpecName: "config") pod "11974898-2dd8-4e18-9d89-64442e4dce69" (UID: "11974898-2dd8-4e18-9d89-64442e4dce69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.800604 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "11974898-2dd8-4e18-9d89-64442e4dce69" (UID: "11974898-2dd8-4e18-9d89-64442e4dce69"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.849324 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.849372 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.849386 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:23 crc kubenswrapper[4708]: I0320 16:22:23.849398 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11974898-2dd8-4e18-9d89-64442e4dce69-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:24 crc kubenswrapper[4708]: E0320 16:22:24.252259 4708 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11974898_2dd8_4e18_9d89_64442e4dce69.slice/crio-d5e61fae0342423cb2f46edbd46b91181877e9ffe8240c18e98ef90a03ad32d2\": RecentStats: unable to find data in memory cache]" Mar 20 16:22:24 crc kubenswrapper[4708]: I0320 16:22:24.320815 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" event={"ID":"11974898-2dd8-4e18-9d89-64442e4dce69","Type":"ContainerDied","Data":"d5e61fae0342423cb2f46edbd46b91181877e9ffe8240c18e98ef90a03ad32d2"} Mar 20 16:22:24 crc kubenswrapper[4708]: I0320 16:22:24.320879 4708 scope.go:117] "RemoveContainer" containerID="f84bd0d3b9c8131f026032beb565feac8cb56887349ac08bac72d55be8487821" 
Mar 20 16:22:24 crc kubenswrapper[4708]: I0320 16:22:24.321081 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-w5nl5" Mar 20 16:22:24 crc kubenswrapper[4708]: I0320 16:22:24.322855 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9a13968c-5e9f-4c12-8105-8b5258e17cfc" containerName="cinder-scheduler" containerID="cri-o://3c3ff13e6488407829f3792db642c16be32c33ccf6e1ce9696139f5df305bf7c" gracePeriod=30 Mar 20 16:22:24 crc kubenswrapper[4708]: I0320 16:22:24.323015 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9a13968c-5e9f-4c12-8105-8b5258e17cfc" containerName="probe" containerID="cri-o://cf91833cf125fcd6f1722339530c383e9d3dc95ca2d18ab88bd2fe109ff77102" gracePeriod=30 Mar 20 16:22:24 crc kubenswrapper[4708]: I0320 16:22:24.354708 4708 scope.go:117] "RemoveContainer" containerID="afd99ff6ac9de8ab71d274a0212a70d25d2ad98998cf887508a60041c2ce366b" Mar 20 16:22:24 crc kubenswrapper[4708]: I0320 16:22:24.371450 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"] Mar 20 16:22:24 crc kubenswrapper[4708]: I0320 16:22:24.378644 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-w5nl5"] Mar 20 16:22:25 crc kubenswrapper[4708]: I0320 16:22:25.317647 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:22:25 crc kubenswrapper[4708]: I0320 16:22:25.333253 4708 generic.go:334] "Generic (PLEG): container finished" podID="9a13968c-5e9f-4c12-8105-8b5258e17cfc" containerID="cf91833cf125fcd6f1722339530c383e9d3dc95ca2d18ab88bd2fe109ff77102" exitCode=0 Mar 20 16:22:25 crc kubenswrapper[4708]: I0320 16:22:25.333302 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"9a13968c-5e9f-4c12-8105-8b5258e17cfc","Type":"ContainerDied","Data":"cf91833cf125fcd6f1722339530c383e9d3dc95ca2d18ab88bd2fe109ff77102"} Mar 20 16:22:25 crc kubenswrapper[4708]: I0320 16:22:25.429439 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6b6ff5cbbd-kjfxp" Mar 20 16:22:25 crc kubenswrapper[4708]: I0320 16:22:25.985369 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 16:22:26 crc kubenswrapper[4708]: I0320 16:22:26.128569 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11974898-2dd8-4e18-9d89-64442e4dce69" path="/var/lib/kubelet/pods/11974898-2dd8-4e18-9d89-64442e4dce69/volumes" Mar 20 16:22:26 crc kubenswrapper[4708]: I0320 16:22:26.840791 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:22:26 crc kubenswrapper[4708]: I0320 16:22:26.843651 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-798d756d44-mhhzm" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.217149 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-65454bf644-7xssx"] Mar 20 16:22:27 crc kubenswrapper[4708]: E0320 16:22:27.217541 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11974898-2dd8-4e18-9d89-64442e4dce69" containerName="init" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.217554 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="11974898-2dd8-4e18-9d89-64442e4dce69" containerName="init" Mar 20 16:22:27 crc kubenswrapper[4708]: E0320 16:22:27.217577 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11974898-2dd8-4e18-9d89-64442e4dce69" containerName="dnsmasq-dns" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.217583 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="11974898-2dd8-4e18-9d89-64442e4dce69" 
containerName="dnsmasq-dns" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.217772 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="11974898-2dd8-4e18-9d89-64442e4dce69" containerName="dnsmasq-dns" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.218660 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.262647 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65454bf644-7xssx"] Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.327990 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-config-data\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.328084 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-public-tls-certs\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.328184 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-internal-tls-certs\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.328234 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-scripts\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.328294 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74b0f3e-01c6-478c-9691-f48fab32af12-logs\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.328329 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8q2g\" (UniqueName: \"kubernetes.io/projected/a74b0f3e-01c6-478c-9691-f48fab32af12-kube-api-access-j8q2g\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.328373 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-combined-ca-bundle\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.430342 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-internal-tls-certs\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.430420 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-scripts\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.430475 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74b0f3e-01c6-478c-9691-f48fab32af12-logs\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.430503 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8q2g\" (UniqueName: \"kubernetes.io/projected/a74b0f3e-01c6-478c-9691-f48fab32af12-kube-api-access-j8q2g\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.430532 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-combined-ca-bundle\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.430566 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-config-data\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.430606 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-public-tls-certs\") pod 
\"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.432686 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74b0f3e-01c6-478c-9691-f48fab32af12-logs\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.437749 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-public-tls-certs\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.438968 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-scripts\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.439333 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-config-data\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.445617 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-combined-ca-bundle\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc 
kubenswrapper[4708]: I0320 16:22:27.446151 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a74b0f3e-01c6-478c-9691-f48fab32af12-internal-tls-certs\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.449177 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8q2g\" (UniqueName: \"kubernetes.io/projected/a74b0f3e-01c6-478c-9691-f48fab32af12-kube-api-access-j8q2g\") pod \"placement-65454bf644-7xssx\" (UID: \"a74b0f3e-01c6-478c-9691-f48fab32af12\") " pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.522072 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7bc9dd67b8-mz4lv" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.558057 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.598295 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b6ff5cbbd-kjfxp"] Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.598533 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6b6ff5cbbd-kjfxp" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon-log" containerID="cri-o://79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e" gracePeriod=30 Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.598581 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6b6ff5cbbd-kjfxp" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon" containerID="cri-o://ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c" gracePeriod=30 Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.613739 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6b6ff5cbbd-kjfxp" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 20 16:22:27 crc kubenswrapper[4708]: I0320 16:22:27.619602 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6b6ff5cbbd-kjfxp" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 20 16:22:28 crc kubenswrapper[4708]: I0320 16:22:28.163028 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65454bf644-7xssx"] Mar 20 16:22:28 crc kubenswrapper[4708]: I0320 16:22:28.369139 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65454bf644-7xssx" 
event={"ID":"a74b0f3e-01c6-478c-9691-f48fab32af12","Type":"ContainerStarted","Data":"60dc426fea3f2d775a309dca0a6d63e15235502c7b609e0af1eac358dca76365"} Mar 20 16:22:28 crc kubenswrapper[4708]: I0320 16:22:28.623904 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.379774 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65454bf644-7xssx" event={"ID":"a74b0f3e-01c6-478c-9691-f48fab32af12","Type":"ContainerStarted","Data":"5c75dd412841693146f89657b0ef96aafc31f9b9fe586ad75cf620336dbd6208"} Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.380267 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65454bf644-7xssx" event={"ID":"a74b0f3e-01c6-478c-9691-f48fab32af12","Type":"ContainerStarted","Data":"013a7aade0f182c81d3b68802b9a48d65c1c15b7f9e570e40c0542a67601b9f8"} Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.380295 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.381986 4708 generic.go:334] "Generic (PLEG): container finished" podID="9a13968c-5e9f-4c12-8105-8b5258e17cfc" containerID="3c3ff13e6488407829f3792db642c16be32c33ccf6e1ce9696139f5df305bf7c" exitCode=0 Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.382020 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9a13968c-5e9f-4c12-8105-8b5258e17cfc","Type":"ContainerDied","Data":"3c3ff13e6488407829f3792db642c16be32c33ccf6e1ce9696139f5df305bf7c"} Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.420170 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-65454bf644-7xssx" podStartSLOduration=2.420148302 podStartE2EDuration="2.420148302s" podCreationTimestamp="2026-03-20 16:22:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:29.414733373 +0000 UTC m=+1304.089070098" watchObservedRunningTime="2026-03-20 16:22:29.420148302 +0000 UTC m=+1304.094485017" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.610009 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.717407 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.774878 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a13968c-5e9f-4c12-8105-8b5258e17cfc-etc-machine-id\") pod \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.774951 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data-custom\") pod \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.774988 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data\") pod \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.775003 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a13968c-5e9f-4c12-8105-8b5258e17cfc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9a13968c-5e9f-4c12-8105-8b5258e17cfc" (UID: 
"9a13968c-5e9f-4c12-8105-8b5258e17cfc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.775066 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h25hc\" (UniqueName: \"kubernetes.io/projected/9a13968c-5e9f-4c12-8105-8b5258e17cfc-kube-api-access-h25hc\") pod \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.775143 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-combined-ca-bundle\") pod \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.775182 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-scripts\") pod \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\" (UID: \"9a13968c-5e9f-4c12-8105-8b5258e17cfc\") " Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.776138 4708 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a13968c-5e9f-4c12-8105-8b5258e17cfc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.801512 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-scripts" (OuterVolumeSpecName: "scripts") pod "9a13968c-5e9f-4c12-8105-8b5258e17cfc" (UID: "9a13968c-5e9f-4c12-8105-8b5258e17cfc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.801555 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a13968c-5e9f-4c12-8105-8b5258e17cfc-kube-api-access-h25hc" (OuterVolumeSpecName: "kube-api-access-h25hc") pod "9a13968c-5e9f-4c12-8105-8b5258e17cfc" (UID: "9a13968c-5e9f-4c12-8105-8b5258e17cfc"). InnerVolumeSpecName "kube-api-access-h25hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.805924 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9a13968c-5e9f-4c12-8105-8b5258e17cfc" (UID: "9a13968c-5e9f-4c12-8105-8b5258e17cfc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.851186 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a13968c-5e9f-4c12-8105-8b5258e17cfc" (UID: "9a13968c-5e9f-4c12-8105-8b5258e17cfc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.882223 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h25hc\" (UniqueName: \"kubernetes.io/projected/9a13968c-5e9f-4c12-8105-8b5258e17cfc-kube-api-access-h25hc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.882261 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.882273 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.882303 4708 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.890885 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data" (OuterVolumeSpecName: "config-data") pod "9a13968c-5e9f-4c12-8105-8b5258e17cfc" (UID: "9a13968c-5e9f-4c12-8105-8b5258e17cfc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:29 crc kubenswrapper[4708]: I0320 16:22:29.984185 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a13968c-5e9f-4c12-8105-8b5258e17cfc-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.168636 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c9d458d9b-7t7xq" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.275994 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cdb79bd7b-v5pg5"] Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.276653 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" podUID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" containerName="barbican-api-log" containerID="cri-o://a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63" gracePeriod=30 Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.277536 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" podUID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" containerName="barbican-api" containerID="cri-o://887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79" gracePeriod=30 Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.343931 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7bd9698484-kk2kq" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.409774 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9a13968c-5e9f-4c12-8105-8b5258e17cfc","Type":"ContainerDied","Data":"aeb3198a8edc9a3058f416e428595d0a803d7f6c86d028d679877f79ac900e38"} Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.409849 4708 scope.go:117] "RemoveContainer" 
containerID="cf91833cf125fcd6f1722339530c383e9d3dc95ca2d18ab88bd2fe109ff77102" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.410038 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.411460 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65454bf644-7xssx" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.457599 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.473249 4708 scope.go:117] "RemoveContainer" containerID="3c3ff13e6488407829f3792db642c16be32c33ccf6e1ce9696139f5df305bf7c" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.488091 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.520408 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:22:30 crc kubenswrapper[4708]: E0320 16:22:30.520964 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a13968c-5e9f-4c12-8105-8b5258e17cfc" containerName="probe" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.520987 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a13968c-5e9f-4c12-8105-8b5258e17cfc" containerName="probe" Mar 20 16:22:30 crc kubenswrapper[4708]: E0320 16:22:30.521007 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a13968c-5e9f-4c12-8105-8b5258e17cfc" containerName="cinder-scheduler" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.521015 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a13968c-5e9f-4c12-8105-8b5258e17cfc" containerName="cinder-scheduler" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.521262 4708 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9a13968c-5e9f-4c12-8105-8b5258e17cfc" containerName="cinder-scheduler" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.521283 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a13968c-5e9f-4c12-8105-8b5258e17cfc" containerName="probe" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.522494 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.528797 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.530185 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.603278 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.603389 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68kj9\" (UniqueName: \"kubernetes.io/projected/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-kube-api-access-68kj9\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.603630 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 
16:22:30.603663 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.603803 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.604038 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.706270 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.706333 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68kj9\" (UniqueName: \"kubernetes.io/projected/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-kube-api-access-68kj9\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.706413 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.706430 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.706458 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.706517 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.706530 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.713760 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.714044 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.714519 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.715117 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.736394 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68kj9\" (UniqueName: \"kubernetes.io/projected/c5c890eb-8583-4d2f-bf71-55e9f83b51d6-kube-api-access-68kj9\") pod \"cinder-scheduler-0\" (UID: \"c5c890eb-8583-4d2f-bf71-55e9f83b51d6\") " pod="openstack/cinder-scheduler-0" Mar 20 16:22:30 crc kubenswrapper[4708]: I0320 16:22:30.853978 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 16:22:31 crc kubenswrapper[4708]: I0320 16:22:31.233163 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-9f694fd9c-lggtm" Mar 20 16:22:31 crc kubenswrapper[4708]: I0320 16:22:31.329715 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-68db5b9d4d-q2n5k"] Mar 20 16:22:31 crc kubenswrapper[4708]: I0320 16:22:31.330085 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-68db5b9d4d-q2n5k" podUID="bf812fb4-4e10-42ae-bc52-bc08b8749d29" containerName="neutron-api" containerID="cri-o://d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad" gracePeriod=30 Mar 20 16:22:31 crc kubenswrapper[4708]: I0320 16:22:31.330602 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-68db5b9d4d-q2n5k" podUID="bf812fb4-4e10-42ae-bc52-bc08b8749d29" containerName="neutron-httpd" containerID="cri-o://fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c" gracePeriod=30 Mar 20 16:22:31 crc kubenswrapper[4708]: I0320 16:22:31.373465 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 16:22:31 crc kubenswrapper[4708]: W0320 16:22:31.381115 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5c890eb_8583_4d2f_bf71_55e9f83b51d6.slice/crio-32512309ff046e220ff714ac65f279cb6af25120ed53697a92bff05981cbd42e WatchSource:0}: Error finding container 32512309ff046e220ff714ac65f279cb6af25120ed53697a92bff05981cbd42e: Status 404 returned error can't find the container with id 32512309ff046e220ff714ac65f279cb6af25120ed53697a92bff05981cbd42e Mar 20 16:22:31 crc kubenswrapper[4708]: I0320 16:22:31.431550 4708 generic.go:334] "Generic (PLEG): container finished" podID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" 
containerID="a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63" exitCode=143 Mar 20 16:22:31 crc kubenswrapper[4708]: I0320 16:22:31.431811 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" event={"ID":"f7ab775f-a2d7-43de-9f9d-356a6a86f930","Type":"ContainerDied","Data":"a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63"} Mar 20 16:22:31 crc kubenswrapper[4708]: I0320 16:22:31.440245 4708 generic.go:334] "Generic (PLEG): container finished" podID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerID="ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c" exitCode=0 Mar 20 16:22:31 crc kubenswrapper[4708]: I0320 16:22:31.440317 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6ff5cbbd-kjfxp" event={"ID":"48bfe1e4-0a34-4af1-badd-c445d2c02ce1","Type":"ContainerDied","Data":"ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c"} Mar 20 16:22:31 crc kubenswrapper[4708]: I0320 16:22:31.443496 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c5c890eb-8583-4d2f-bf71-55e9f83b51d6","Type":"ContainerStarted","Data":"32512309ff046e220ff714ac65f279cb6af25120ed53697a92bff05981cbd42e"} Mar 20 16:22:32 crc kubenswrapper[4708]: I0320 16:22:32.126574 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a13968c-5e9f-4c12-8105-8b5258e17cfc" path="/var/lib/kubelet/pods/9a13968c-5e9f-4c12-8105-8b5258e17cfc/volumes" Mar 20 16:22:32 crc kubenswrapper[4708]: I0320 16:22:32.460890 4708 generic.go:334] "Generic (PLEG): container finished" podID="bf812fb4-4e10-42ae-bc52-bc08b8749d29" containerID="fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c" exitCode=0 Mar 20 16:22:32 crc kubenswrapper[4708]: I0320 16:22:32.462986 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68db5b9d4d-q2n5k" 
event={"ID":"bf812fb4-4e10-42ae-bc52-bc08b8749d29","Type":"ContainerDied","Data":"fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c"} Mar 20 16:22:32 crc kubenswrapper[4708]: I0320 16:22:32.477384 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c5c890eb-8583-4d2f-bf71-55e9f83b51d6","Type":"ContainerStarted","Data":"f22fe1c62e4d070d7b3f92b9e5c86f475a02c18bb08d4fd9c8a18db99ac0cc42"} Mar 20 16:22:32 crc kubenswrapper[4708]: I0320 16:22:32.767172 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6b6ff5cbbd-kjfxp" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.162295 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.279692 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-httpd-config\") pod \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.279752 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-ovndb-tls-certs\") pod \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.279815 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-combined-ca-bundle\") pod 
\"bf812fb4-4e10-42ae-bc52-bc08b8749d29\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.279946 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpf7t\" (UniqueName: \"kubernetes.io/projected/bf812fb4-4e10-42ae-bc52-bc08b8749d29-kube-api-access-bpf7t\") pod \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.279971 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-config\") pod \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\" (UID: \"bf812fb4-4e10-42ae-bc52-bc08b8749d29\") " Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.303875 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf812fb4-4e10-42ae-bc52-bc08b8749d29-kube-api-access-bpf7t" (OuterVolumeSpecName: "kube-api-access-bpf7t") pod "bf812fb4-4e10-42ae-bc52-bc08b8749d29" (UID: "bf812fb4-4e10-42ae-bc52-bc08b8749d29"). InnerVolumeSpecName "kube-api-access-bpf7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.322827 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bf812fb4-4e10-42ae-bc52-bc08b8749d29" (UID: "bf812fb4-4e10-42ae-bc52-bc08b8749d29"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.377458 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf812fb4-4e10-42ae-bc52-bc08b8749d29" (UID: "bf812fb4-4e10-42ae-bc52-bc08b8749d29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.385500 4708 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.385533 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.385545 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpf7t\" (UniqueName: \"kubernetes.io/projected/bf812fb4-4e10-42ae-bc52-bc08b8749d29-kube-api-access-bpf7t\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.444821 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 16:22:33 crc kubenswrapper[4708]: E0320 16:22:33.445802 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf812fb4-4e10-42ae-bc52-bc08b8749d29" containerName="neutron-httpd" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.445822 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf812fb4-4e10-42ae-bc52-bc08b8749d29" containerName="neutron-httpd" Mar 20 16:22:33 crc kubenswrapper[4708]: E0320 16:22:33.445864 4708 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf812fb4-4e10-42ae-bc52-bc08b8749d29" containerName="neutron-api" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.445871 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf812fb4-4e10-42ae-bc52-bc08b8749d29" containerName="neutron-api" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.446267 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf812fb4-4e10-42ae-bc52-bc08b8749d29" containerName="neutron-httpd" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.446320 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf812fb4-4e10-42ae-bc52-bc08b8749d29" containerName="neutron-api" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.447280 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.450224 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-j6729" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.451377 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.451725 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.486140 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.489583 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zn64\" (UniqueName: \"kubernetes.io/projected/fee1204d-4f6f-4c1c-bb48-74b59703672a-kube-api-access-8zn64\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.489685 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config-secret\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.489773 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.489864 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.490886 4708 generic.go:334] "Generic (PLEG): container finished" podID="bf812fb4-4e10-42ae-bc52-bc08b8749d29" containerID="d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad" exitCode=0 Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.490975 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68db5b9d4d-q2n5k" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.491051 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68db5b9d4d-q2n5k" event={"ID":"bf812fb4-4e10-42ae-bc52-bc08b8749d29","Type":"ContainerDied","Data":"d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad"} Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.491091 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68db5b9d4d-q2n5k" event={"ID":"bf812fb4-4e10-42ae-bc52-bc08b8749d29","Type":"ContainerDied","Data":"39c308e50ed77143f2ef7df37ff841a983f0e60ff2d80e4a626d1017f21b878f"} Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.491132 4708 scope.go:117] "RemoveContainer" containerID="fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.494399 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-config" (OuterVolumeSpecName: "config") pod "bf812fb4-4e10-42ae-bc52-bc08b8749d29" (UID: "bf812fb4-4e10-42ae-bc52-bc08b8749d29"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.497489 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c5c890eb-8583-4d2f-bf71-55e9f83b51d6","Type":"ContainerStarted","Data":"5021a2de17612372ff1e9b319d52e2689ef65db7a4c3ba4c587220763936202a"} Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.504373 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" podUID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:41056->10.217.0.165:9311: read: connection reset by peer" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.504430 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" podUID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:41060->10.217.0.165:9311: read: connection reset by peer" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.522806 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bf812fb4-4e10-42ae-bc52-bc08b8749d29" (UID: "bf812fb4-4e10-42ae-bc52-bc08b8749d29"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.557057 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.5570328829999998 podStartE2EDuration="3.557032883s" podCreationTimestamp="2026-03-20 16:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:33.525041457 +0000 UTC m=+1308.199378172" watchObservedRunningTime="2026-03-20 16:22:33.557032883 +0000 UTC m=+1308.231369588" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.592565 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zn64\" (UniqueName: \"kubernetes.io/projected/fee1204d-4f6f-4c1c-bb48-74b59703672a-kube-api-access-8zn64\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.592704 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config-secret\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.592832 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.592870 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.593072 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.595217 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.593089 4708 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf812fb4-4e10-42ae-bc52-bc08b8749d29-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.599244 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config-secret\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.601024 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.613274 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zn64\" (UniqueName: 
\"kubernetes.io/projected/fee1204d-4f6f-4c1c-bb48-74b59703672a-kube-api-access-8zn64\") pod \"openstackclient\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.756828 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.816008 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.819163 4708 scope.go:117] "RemoveContainer" containerID="d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.873108 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.877998 4708 scope.go:117] "RemoveContainer" containerID="fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c" Mar 20 16:22:33 crc kubenswrapper[4708]: E0320 16:22:33.878424 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c\": container with ID starting with fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c not found: ID does not exist" containerID="fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.878450 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c"} err="failed to get container status \"fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c\": rpc error: code = NotFound desc = could not find container \"fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c\": container with ID starting with 
fd34a6958534b681b7407f29bfe21b7515d86a2e6efe20e448fb36dc809d0f7c not found: ID does not exist" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.878471 4708 scope.go:117] "RemoveContainer" containerID="d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad" Mar 20 16:22:33 crc kubenswrapper[4708]: E0320 16:22:33.878640 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad\": container with ID starting with d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad not found: ID does not exist" containerID="d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.878657 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad"} err="failed to get container status \"d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad\": rpc error: code = NotFound desc = could not find container \"d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad\": container with ID starting with d83d9553ec12482c2d00bed4fac5e8202323113f8f14f182730c52b0478dcbad not found: ID does not exist" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.881603 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.882741 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.893643 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-68db5b9d4d-q2n5k"] Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.901075 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/08cb3912-b1b4-40bb-a815-7c7ca540f327-openstack-config-secret\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.901120 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08cb3912-b1b4-40bb-a815-7c7ca540f327-combined-ca-bundle\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.901420 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5br9\" (UniqueName: \"kubernetes.io/projected/08cb3912-b1b4-40bb-a815-7c7ca540f327-kube-api-access-f5br9\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.901645 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/08cb3912-b1b4-40bb-a815-7c7ca540f327-openstack-config\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.907926 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 16:22:33 crc kubenswrapper[4708]: I0320 16:22:33.917547 
4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-68db5b9d4d-q2n5k"] Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.009376 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/08cb3912-b1b4-40bb-a815-7c7ca540f327-openstack-config-secret\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.009823 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08cb3912-b1b4-40bb-a815-7c7ca540f327-combined-ca-bundle\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.010093 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5br9\" (UniqueName: \"kubernetes.io/projected/08cb3912-b1b4-40bb-a815-7c7ca540f327-kube-api-access-f5br9\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.010172 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/08cb3912-b1b4-40bb-a815-7c7ca540f327-openstack-config\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.010975 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/08cb3912-b1b4-40bb-a815-7c7ca540f327-openstack-config\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 
16:22:34.017120 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.028267 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/08cb3912-b1b4-40bb-a815-7c7ca540f327-openstack-config-secret\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.028374 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5br9\" (UniqueName: \"kubernetes.io/projected/08cb3912-b1b4-40bb-a815-7c7ca540f327-kube-api-access-f5br9\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: E0320 16:22:34.030687 4708 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 16:22:34 crc kubenswrapper[4708]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_fee1204d-4f6f-4c1c-bb48-74b59703672a_0(aa4cee956c96a0b503be04da44d2b0ba0efde30dc3b796a0016a2da84e78f809): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"aa4cee956c96a0b503be04da44d2b0ba0efde30dc3b796a0016a2da84e78f809" Netns:"/var/run/netns/47ea4e15-8f87-4388-89e3-37aa1254312a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=aa4cee956c96a0b503be04da44d2b0ba0efde30dc3b796a0016a2da84e78f809;K8S_POD_UID=fee1204d-4f6f-4c1c-bb48-74b59703672a" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/fee1204d-4f6f-4c1c-bb48-74b59703672a]: expected pod 
UID "fee1204d-4f6f-4c1c-bb48-74b59703672a" but got "08cb3912-b1b4-40bb-a815-7c7ca540f327" from Kube API Mar 20 16:22:34 crc kubenswrapper[4708]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 16:22:34 crc kubenswrapper[4708]: > Mar 20 16:22:34 crc kubenswrapper[4708]: E0320 16:22:34.030775 4708 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 16:22:34 crc kubenswrapper[4708]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_fee1204d-4f6f-4c1c-bb48-74b59703672a_0(aa4cee956c96a0b503be04da44d2b0ba0efde30dc3b796a0016a2da84e78f809): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"aa4cee956c96a0b503be04da44d2b0ba0efde30dc3b796a0016a2da84e78f809" Netns:"/var/run/netns/47ea4e15-8f87-4388-89e3-37aa1254312a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=aa4cee956c96a0b503be04da44d2b0ba0efde30dc3b796a0016a2da84e78f809;K8S_POD_UID=fee1204d-4f6f-4c1c-bb48-74b59703672a" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/fee1204d-4f6f-4c1c-bb48-74b59703672a]: expected pod UID "fee1204d-4f6f-4c1c-bb48-74b59703672a" but got "08cb3912-b1b4-40bb-a815-7c7ca540f327" from Kube API Mar 20 16:22:34 crc kubenswrapper[4708]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 16:22:34 crc kubenswrapper[4708]: > pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.031555 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08cb3912-b1b4-40bb-a815-7c7ca540f327-combined-ca-bundle\") pod \"openstackclient\" (UID: \"08cb3912-b1b4-40bb-a815-7c7ca540f327\") " pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.114586 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data\") pod \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.114707 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7ab775f-a2d7-43de-9f9d-356a6a86f930-logs\") pod \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.114797 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data-custom\") pod \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.114832 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-combined-ca-bundle\") pod \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.114911 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq58f\" (UniqueName: \"kubernetes.io/projected/f7ab775f-a2d7-43de-9f9d-356a6a86f930-kube-api-access-mq58f\") pod \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\" (UID: \"f7ab775f-a2d7-43de-9f9d-356a6a86f930\") " Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.116270 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ab775f-a2d7-43de-9f9d-356a6a86f930-logs" (OuterVolumeSpecName: "logs") pod "f7ab775f-a2d7-43de-9f9d-356a6a86f930" (UID: "f7ab775f-a2d7-43de-9f9d-356a6a86f930"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.120462 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f7ab775f-a2d7-43de-9f9d-356a6a86f930" (UID: "f7ab775f-a2d7-43de-9f9d-356a6a86f930"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.122724 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ab775f-a2d7-43de-9f9d-356a6a86f930-kube-api-access-mq58f" (OuterVolumeSpecName: "kube-api-access-mq58f") pod "f7ab775f-a2d7-43de-9f9d-356a6a86f930" (UID: "f7ab775f-a2d7-43de-9f9d-356a6a86f930"). InnerVolumeSpecName "kube-api-access-mq58f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.149602 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf812fb4-4e10-42ae-bc52-bc08b8749d29" path="/var/lib/kubelet/pods/bf812fb4-4e10-42ae-bc52-bc08b8749d29/volumes" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.165477 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7ab775f-a2d7-43de-9f9d-356a6a86f930" (UID: "f7ab775f-a2d7-43de-9f9d-356a6a86f930"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.187838 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data" (OuterVolumeSpecName: "config-data") pod "f7ab775f-a2d7-43de-9f9d-356a6a86f930" (UID: "f7ab775f-a2d7-43de-9f9d-356a6a86f930"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.200513 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.217306 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7ab775f-a2d7-43de-9f9d-356a6a86f930-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.217342 4708 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.217357 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.217366 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq58f\" (UniqueName: \"kubernetes.io/projected/f7ab775f-a2d7-43de-9f9d-356a6a86f930-kube-api-access-mq58f\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.217376 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab775f-a2d7-43de-9f9d-356a6a86f930-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.526006 4708 generic.go:334] "Generic (PLEG): container finished" podID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" containerID="887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79" exitCode=0 Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.526197 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.526214 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" event={"ID":"f7ab775f-a2d7-43de-9f9d-356a6a86f930","Type":"ContainerDied","Data":"887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79"} Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.526961 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cdb79bd7b-v5pg5" event={"ID":"f7ab775f-a2d7-43de-9f9d-356a6a86f930","Type":"ContainerDied","Data":"07ef991ff2fd7f224c7da06a02ff1f7c5c8fa07d5c1afa20bcd457198707488d"} Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.526980 4708 scope.go:117] "RemoveContainer" containerID="887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.540578 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.605694 4708 scope.go:117] "RemoveContainer" containerID="a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.610364 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.617534 4708 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fee1204d-4f6f-4c1c-bb48-74b59703672a" podUID="08cb3912-b1b4-40bb-a815-7c7ca540f327" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.629466 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cdb79bd7b-v5pg5"] Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.634747 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config\") pod \"fee1204d-4f6f-4c1c-bb48-74b59703672a\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.636357 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fee1204d-4f6f-4c1c-bb48-74b59703672a" (UID: "fee1204d-4f6f-4c1c-bb48-74b59703672a"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.638365 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6cdb79bd7b-v5pg5"] Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.652447 4708 scope.go:117] "RemoveContainer" containerID="887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79" Mar 20 16:22:34 crc kubenswrapper[4708]: E0320 16:22:34.653172 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79\": container with ID starting with 887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79 not found: ID does not exist" containerID="887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.653200 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79"} err="failed to get container status \"887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79\": rpc error: code = NotFound desc = could not find container \"887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79\": container with ID starting with 887eff954bb05170c74203188dd7bcdf610503a5caf7d60896513e2a5a6ddb79 not found: ID does not exist" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.653229 4708 scope.go:117] "RemoveContainer" containerID="a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63" Mar 20 16:22:34 crc kubenswrapper[4708]: E0320 16:22:34.653416 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63\": container with ID starting with a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63 not 
found: ID does not exist" containerID="a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.653434 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63"} err="failed to get container status \"a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63\": rpc error: code = NotFound desc = could not find container \"a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63\": container with ID starting with a9c6e36c71e80b0094d86f0d15fae7e68e1792a8d0616ee3f4d1ade6119a9a63 not found: ID does not exist" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.736628 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config-secret\") pod \"fee1204d-4f6f-4c1c-bb48-74b59703672a\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.739433 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-combined-ca-bundle\") pod \"fee1204d-4f6f-4c1c-bb48-74b59703672a\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.739695 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zn64\" (UniqueName: \"kubernetes.io/projected/fee1204d-4f6f-4c1c-bb48-74b59703672a-kube-api-access-8zn64\") pod \"fee1204d-4f6f-4c1c-bb48-74b59703672a\" (UID: \"fee1204d-4f6f-4c1c-bb48-74b59703672a\") " Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.740824 4708 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.746147 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fee1204d-4f6f-4c1c-bb48-74b59703672a" (UID: "fee1204d-4f6f-4c1c-bb48-74b59703672a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.752499 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee1204d-4f6f-4c1c-bb48-74b59703672a-kube-api-access-8zn64" (OuterVolumeSpecName: "kube-api-access-8zn64") pod "fee1204d-4f6f-4c1c-bb48-74b59703672a" (UID: "fee1204d-4f6f-4c1c-bb48-74b59703672a"). InnerVolumeSpecName "kube-api-access-8zn64". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.752533 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fee1204d-4f6f-4c1c-bb48-74b59703672a" (UID: "fee1204d-4f6f-4c1c-bb48-74b59703672a"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.758183 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.842184 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zn64\" (UniqueName: \"kubernetes.io/projected/fee1204d-4f6f-4c1c-bb48-74b59703672a-kube-api-access-8zn64\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.842212 4708 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:34 crc kubenswrapper[4708]: I0320 16:22:34.842222 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee1204d-4f6f-4c1c-bb48-74b59703672a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:35 crc kubenswrapper[4708]: I0320 16:22:35.554188 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 16:22:35 crc kubenswrapper[4708]: I0320 16:22:35.554372 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"08cb3912-b1b4-40bb-a815-7c7ca540f327","Type":"ContainerStarted","Data":"a690aa8dbca235d8df292f473d2d782ca7036b208f67f05f9ff4bbad272b640e"} Mar 20 16:22:35 crc kubenswrapper[4708]: I0320 16:22:35.580627 4708 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fee1204d-4f6f-4c1c-bb48-74b59703672a" podUID="08cb3912-b1b4-40bb-a815-7c7ca540f327" Mar 20 16:22:35 crc kubenswrapper[4708]: I0320 16:22:35.855250 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.126596 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" path="/var/lib/kubelet/pods/f7ab775f-a2d7-43de-9f9d-356a6a86f930/volumes" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.127828 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee1204d-4f6f-4c1c-bb48-74b59703672a" path="/var/lib/kubelet/pods/fee1204d-4f6f-4c1c-bb48-74b59703672a/volumes" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.873900 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6f48f97b7c-qw6zw"] Mar 20 16:22:36 crc kubenswrapper[4708]: E0320 16:22:36.874617 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" containerName="barbican-api" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.874632 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" containerName="barbican-api" Mar 20 16:22:36 crc kubenswrapper[4708]: E0320 16:22:36.874685 4708 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" containerName="barbican-api-log" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.874693 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" containerName="barbican-api-log" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.874889 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" containerName="barbican-api-log" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.874932 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ab775f-a2d7-43de-9f9d-356a6a86f930" containerName="barbican-api" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.876135 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.880776 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.881002 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.881135 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.906643 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f48f97b7c-qw6zw"] Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.986388 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-combined-ca-bundle\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 
16:22:36.986446 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/519360dd-4258-4c54-a239-55283b46ffb3-etc-swift\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.986481 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/519360dd-4258-4c54-a239-55283b46ffb3-run-httpd\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.986544 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrfm\" (UniqueName: \"kubernetes.io/projected/519360dd-4258-4c54-a239-55283b46ffb3-kube-api-access-jkrfm\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.986948 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/519360dd-4258-4c54-a239-55283b46ffb3-log-httpd\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.987015 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-config-data\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:36 crc 
kubenswrapper[4708]: I0320 16:22:36.987440 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-internal-tls-certs\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:36 crc kubenswrapper[4708]: I0320 16:22:36.987525 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-public-tls-certs\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.084185 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.087046 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="ceilometer-notification-agent" containerID="cri-o://e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51" gracePeriod=30 Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.087029 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="ceilometer-central-agent" containerID="cri-o://59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875" gracePeriod=30 Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.087168 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="proxy-httpd" containerID="cri-o://456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea" 
gracePeriod=30 Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.087244 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="sg-core" containerID="cri-o://bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd" gracePeriod=30 Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.090688 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/519360dd-4258-4c54-a239-55283b46ffb3-log-httpd\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.090733 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-config-data\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.090805 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-internal-tls-certs\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.090828 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-public-tls-certs\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.090880 4708 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-combined-ca-bundle\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.090906 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/519360dd-4258-4c54-a239-55283b46ffb3-etc-swift\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.090935 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/519360dd-4258-4c54-a239-55283b46ffb3-run-httpd\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.090976 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrfm\" (UniqueName: \"kubernetes.io/projected/519360dd-4258-4c54-a239-55283b46ffb3-kube-api-access-jkrfm\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.091828 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/519360dd-4258-4c54-a239-55283b46ffb3-log-httpd\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.096114 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/519360dd-4258-4c54-a239-55283b46ffb3-run-httpd\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.097950 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.101183 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-internal-tls-certs\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.102552 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-combined-ca-bundle\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.107656 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/519360dd-4258-4c54-a239-55283b46ffb3-etc-swift\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.108409 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-config-data\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.115945 4708 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jkrfm\" (UniqueName: \"kubernetes.io/projected/519360dd-4258-4c54-a239-55283b46ffb3-kube-api-access-jkrfm\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.130839 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/519360dd-4258-4c54-a239-55283b46ffb3-public-tls-certs\") pod \"swift-proxy-6f48f97b7c-qw6zw\" (UID: \"519360dd-4258-4c54-a239-55283b46ffb3\") " pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.198951 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.587089 4708 generic.go:334] "Generic (PLEG): container finished" podID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerID="456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea" exitCode=0 Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.587416 4708 generic.go:334] "Generic (PLEG): container finished" podID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerID="bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd" exitCode=2 Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.587425 4708 generic.go:334] "Generic (PLEG): container finished" podID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerID="59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875" exitCode=0 Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.587161 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3","Type":"ContainerDied","Data":"456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea"} Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.587463 4708 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3","Type":"ContainerDied","Data":"bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd"} Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.587477 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3","Type":"ContainerDied","Data":"59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875"} Mar 20 16:22:37 crc kubenswrapper[4708]: I0320 16:22:37.864100 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f48f97b7c-qw6zw"] Mar 20 16:22:37 crc kubenswrapper[4708]: W0320 16:22:37.879387 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod519360dd_4258_4c54_a239_55283b46ffb3.slice/crio-a9f23974f1d1ccc41fe70850522f359bd1fdb7de650112d37d650b549dc8eaba WatchSource:0}: Error finding container a9f23974f1d1ccc41fe70850522f359bd1fdb7de650112d37d650b549dc8eaba: Status 404 returned error can't find the container with id a9f23974f1d1ccc41fe70850522f359bd1fdb7de650112d37d650b549dc8eaba Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.307745 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.420401 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq66q\" (UniqueName: \"kubernetes.io/projected/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-kube-api-access-qq66q\") pod \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.420563 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-scripts\") pod \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.420731 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-log-httpd\") pod \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.420809 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-config-data\") pod \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.420989 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-run-httpd\") pod \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.421048 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-combined-ca-bundle\") pod \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.421100 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-sg-core-conf-yaml\") pod \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\" (UID: \"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3\") " Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.422862 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" (UID: "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.423243 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" (UID: "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.426304 4708 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.426355 4708 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.428098 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-scripts" (OuterVolumeSpecName: "scripts") pod "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" (UID: "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.440171 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-kube-api-access-qq66q" (OuterVolumeSpecName: "kube-api-access-qq66q") pod "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" (UID: "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3"). InnerVolumeSpecName "kube-api-access-qq66q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.473946 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" (UID: "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.524425 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" (UID: "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.532019 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.532366 4708 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.532447 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq66q\" (UniqueName: \"kubernetes.io/projected/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-kube-api-access-qq66q\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.532514 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.552783 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-config-data" (OuterVolumeSpecName: "config-data") pod "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" (UID: "5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.608109 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f48f97b7c-qw6zw" event={"ID":"519360dd-4258-4c54-a239-55283b46ffb3","Type":"ContainerStarted","Data":"5efc00ddd71a6d2f12e4399d6b78cdbdc4f80f5e87190e890ae15acd89a4fb49"} Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.608483 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.608587 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f48f97b7c-qw6zw" event={"ID":"519360dd-4258-4c54-a239-55283b46ffb3","Type":"ContainerStarted","Data":"a53d0a8a54747fd24bd097b203ca918e673048f6fa774bf49f9204d07437e93a"} Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.608651 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f48f97b7c-qw6zw" event={"ID":"519360dd-4258-4c54-a239-55283b46ffb3","Type":"ContainerStarted","Data":"a9f23974f1d1ccc41fe70850522f359bd1fdb7de650112d37d650b549dc8eaba"} Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.608754 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.615584 4708 generic.go:334] "Generic (PLEG): container finished" podID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerID="e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51" exitCode=0 Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.615931 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.618793 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3","Type":"ContainerDied","Data":"e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51"} Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.618865 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3","Type":"ContainerDied","Data":"3ac7bd427a72918b86e3d0a88283e17fe850b07e8d5f0a4b5dce4cb0850a458d"} Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.618888 4708 scope.go:117] "RemoveContainer" containerID="456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.635945 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6f48f97b7c-qw6zw" podStartSLOduration=2.635917774 podStartE2EDuration="2.635917774s" podCreationTimestamp="2026-03-20 16:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:38.633736164 +0000 UTC m=+1313.308072879" watchObservedRunningTime="2026-03-20 16:22:38.635917774 +0000 UTC m=+1313.310254489" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.636587 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.681975 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.697154 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 
16:22:38.701751 4708 scope.go:117] "RemoveContainer" containerID="bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.712826 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:38 crc kubenswrapper[4708]: E0320 16:22:38.713350 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="sg-core" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.713373 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="sg-core" Mar 20 16:22:38 crc kubenswrapper[4708]: E0320 16:22:38.713392 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="ceilometer-central-agent" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.713399 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="ceilometer-central-agent" Mar 20 16:22:38 crc kubenswrapper[4708]: E0320 16:22:38.713408 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="ceilometer-notification-agent" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.713416 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="ceilometer-notification-agent" Mar 20 16:22:38 crc kubenswrapper[4708]: E0320 16:22:38.713435 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="proxy-httpd" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.713442 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="proxy-httpd" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.713697 4708 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="ceilometer-notification-agent" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.713725 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="proxy-httpd" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.713740 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="sg-core" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.713755 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" containerName="ceilometer-central-agent" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.715817 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.723851 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.725082 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.726183 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.729704 4708 scope.go:117] "RemoveContainer" containerID="e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.777322 4708 scope.go:117] "RemoveContainer" containerID="59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.798623 4708 scope.go:117] "RemoveContainer" containerID="456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea" Mar 20 16:22:38 crc kubenswrapper[4708]: E0320 16:22:38.799145 4708 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea\": container with ID starting with 456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea not found: ID does not exist" containerID="456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.799183 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea"} err="failed to get container status \"456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea\": rpc error: code = NotFound desc = could not find container \"456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea\": container with ID starting with 456a009b73bed1c47507a5ab5b198bc6ed0af6612a753dee51f4e8c2546970ea not found: ID does not exist" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.799204 4708 scope.go:117] "RemoveContainer" containerID="bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd" Mar 20 16:22:38 crc kubenswrapper[4708]: E0320 16:22:38.800757 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd\": container with ID starting with bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd not found: ID does not exist" containerID="bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.800788 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd"} err="failed to get container status \"bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd\": rpc error: code = NotFound desc = could not find container 
\"bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd\": container with ID starting with bf3eaa830a8845359c4295405f2b2c2788a901651c743c3b93efaca67b54b7cd not found: ID does not exist" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.800837 4708 scope.go:117] "RemoveContainer" containerID="e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51" Mar 20 16:22:38 crc kubenswrapper[4708]: E0320 16:22:38.801836 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51\": container with ID starting with e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51 not found: ID does not exist" containerID="e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.801862 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51"} err="failed to get container status \"e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51\": rpc error: code = NotFound desc = could not find container \"e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51\": container with ID starting with e77d49bb03ae9df71dcb4fb92f2ffb6eb84866dae498527ba38f87dcc15b4c51 not found: ID does not exist" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.801879 4708 scope.go:117] "RemoveContainer" containerID="59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875" Mar 20 16:22:38 crc kubenswrapper[4708]: E0320 16:22:38.803133 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875\": container with ID starting with 59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875 not found: ID does not exist" 
containerID="59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.803169 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875"} err="failed to get container status \"59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875\": rpc error: code = NotFound desc = could not find container \"59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875\": container with ID starting with 59c80885a26b83b67b306b3d944184f4da67ce57c0cb7367e59392481b664875 not found: ID does not exist" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.840970 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-run-httpd\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.841041 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-scripts\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.841075 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-log-httpd\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.841102 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-config-data\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.841168 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.841189 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.841243 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qczlv\" (UniqueName: \"kubernetes.io/projected/20739bf7-b966-4dbd-8846-4bda838c5da4-kube-api-access-qczlv\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.942776 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-scripts\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.942846 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-log-httpd\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " 
pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.942879 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-config-data\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.942948 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.942974 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.943029 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qczlv\" (UniqueName: \"kubernetes.io/projected/20739bf7-b966-4dbd-8846-4bda838c5da4-kube-api-access-qczlv\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.943055 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-run-httpd\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.943969 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-log-httpd\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.944576 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-run-httpd\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.948242 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-config-data\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.948531 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.950120 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-scripts\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.950297 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:38 crc kubenswrapper[4708]: I0320 16:22:38.963130 4708 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qczlv\" (UniqueName: \"kubernetes.io/projected/20739bf7-b966-4dbd-8846-4bda838c5da4-kube-api-access-qczlv\") pod \"ceilometer-0\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " pod="openstack/ceilometer-0" Mar 20 16:22:39 crc kubenswrapper[4708]: I0320 16:22:39.037788 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:39 crc kubenswrapper[4708]: I0320 16:22:39.539508 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:39 crc kubenswrapper[4708]: W0320 16:22:39.555826 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20739bf7_b966_4dbd_8846_4bda838c5da4.slice/crio-11e20719508eb00580072cc1797cc1f6e9e891987d981e6753addb9045972c97 WatchSource:0}: Error finding container 11e20719508eb00580072cc1797cc1f6e9e891987d981e6753addb9045972c97: Status 404 returned error can't find the container with id 11e20719508eb00580072cc1797cc1f6e9e891987d981e6753addb9045972c97 Mar 20 16:22:39 crc kubenswrapper[4708]: I0320 16:22:39.636493 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20739bf7-b966-4dbd-8846-4bda838c5da4","Type":"ContainerStarted","Data":"11e20719508eb00580072cc1797cc1f6e9e891987d981e6753addb9045972c97"} Mar 20 16:22:40 crc kubenswrapper[4708]: I0320 16:22:40.124935 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3" path="/var/lib/kubelet/pods/5a1b1ae4-dad2-438b-81b1-a1ee3d7d9ee3/volumes" Mar 20 16:22:41 crc kubenswrapper[4708]: I0320 16:22:41.119205 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 16:22:42 crc kubenswrapper[4708]: I0320 16:22:42.766125 4708 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/horizon-6b6ff5cbbd-kjfxp" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 20 16:22:45 crc kubenswrapper[4708]: I0320 16:22:45.708542 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20739bf7-b966-4dbd-8846-4bda838c5da4","Type":"ContainerStarted","Data":"b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92"} Mar 20 16:22:45 crc kubenswrapper[4708]: I0320 16:22:45.710812 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"08cb3912-b1b4-40bb-a815-7c7ca540f327","Type":"ContainerStarted","Data":"2b925cd676bb883e67d3c372d7a3cccec4d77f66db3ee5e3ae27507fe0d58ef3"} Mar 20 16:22:45 crc kubenswrapper[4708]: I0320 16:22:45.739660 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.304464331 podStartE2EDuration="12.739643607s" podCreationTimestamp="2026-03-20 16:22:33 +0000 UTC" firstStartedPulling="2026-03-20 16:22:34.764217981 +0000 UTC m=+1309.438554696" lastFinishedPulling="2026-03-20 16:22:45.199397257 +0000 UTC m=+1319.873733972" observedRunningTime="2026-03-20 16:22:45.731659159 +0000 UTC m=+1320.405995864" watchObservedRunningTime="2026-03-20 16:22:45.739643607 +0000 UTC m=+1320.413980322" Mar 20 16:22:46 crc kubenswrapper[4708]: I0320 16:22:46.557832 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:46 crc kubenswrapper[4708]: I0320 16:22:46.558680 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ac8c72be-4723-43be-8537-072adeb1e924" containerName="glance-log" containerID="cri-o://81ccf63c44b61beb53e56f9b9bbb79ac7b584bc9050d2b8ea6e95ac0d0995df8" 
gracePeriod=30 Mar 20 16:22:46 crc kubenswrapper[4708]: I0320 16:22:46.558797 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ac8c72be-4723-43be-8537-072adeb1e924" containerName="glance-httpd" containerID="cri-o://e2c3b178ed0e541d253f458a7688daacddfaec3e1b2fcd36a923a2e80a254119" gracePeriod=30 Mar 20 16:22:46 crc kubenswrapper[4708]: I0320 16:22:46.723694 4708 generic.go:334] "Generic (PLEG): container finished" podID="ac8c72be-4723-43be-8537-072adeb1e924" containerID="81ccf63c44b61beb53e56f9b9bbb79ac7b584bc9050d2b8ea6e95ac0d0995df8" exitCode=143 Mar 20 16:22:46 crc kubenswrapper[4708]: I0320 16:22:46.723757 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac8c72be-4723-43be-8537-072adeb1e924","Type":"ContainerDied","Data":"81ccf63c44b61beb53e56f9b9bbb79ac7b584bc9050d2b8ea6e95ac0d0995df8"} Mar 20 16:22:46 crc kubenswrapper[4708]: I0320 16:22:46.726990 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20739bf7-b966-4dbd-8846-4bda838c5da4","Type":"ContainerStarted","Data":"b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5"} Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.206967 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.210744 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f48f97b7c-qw6zw" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.530177 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.743522 4708 generic.go:334] "Generic (PLEG): container finished" podID="00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" 
containerID="4ba8669b02f07a77d6f20179a54d39ad8e036c49843f75706ab61fd4849e1aec" exitCode=137 Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.743879 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c","Type":"ContainerDied","Data":"4ba8669b02f07a77d6f20179a54d39ad8e036c49843f75706ab61fd4849e1aec"} Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.743928 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c","Type":"ContainerDied","Data":"ec5ba7cc3a0fd6d57bd86b709d22c437a0687105d28654f971ff0bb2f86dc0f8"} Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.743943 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec5ba7cc3a0fd6d57bd86b709d22c437a0687105d28654f971ff0bb2f86dc0f8" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.747609 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20739bf7-b966-4dbd-8846-4bda838c5da4","Type":"ContainerStarted","Data":"f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a"} Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.785077 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.835726 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-combined-ca-bundle\") pod \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.835904 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data\") pod \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.836030 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data-custom\") pod \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.836096 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkptn\" (UniqueName: \"kubernetes.io/projected/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-kube-api-access-pkptn\") pod \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.836133 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-etc-machine-id\") pod \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.836154 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-logs\") pod \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.836199 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-scripts\") pod \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\" (UID: \"00996e6f-a64c-4f6b-85ba-3c1ed4284b8c\") " Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.837163 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-logs" (OuterVolumeSpecName: "logs") pod "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" (UID: "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.837203 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" (UID: "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.839656 4708 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.839819 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.846853 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-scripts" (OuterVolumeSpecName: "scripts") pod "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" (UID: "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.847054 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-kube-api-access-pkptn" (OuterVolumeSpecName: "kube-api-access-pkptn") pod "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" (UID: "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c"). InnerVolumeSpecName "kube-api-access-pkptn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.877838 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" (UID: "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.893458 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" (UID: "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.941298 4708 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.941335 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkptn\" (UniqueName: \"kubernetes.io/projected/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-kube-api-access-pkptn\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.941348 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.941356 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:47 crc kubenswrapper[4708]: I0320 16:22:47.953888 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data" (OuterVolumeSpecName: "config-data") pod "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" (UID: "00996e6f-a64c-4f6b-85ba-3c1ed4284b8c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.043660 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.584476 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.585300 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0d2a3b0e-e329-43b5-937a-107b0bea8941" containerName="glance-log" containerID="cri-o://fe4ed75fcf01f154ccd411b7819fcfe28253b3198ad4de5b3b4d33a84578e814" gracePeriod=30 Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.585390 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0d2a3b0e-e329-43b5-937a-107b0bea8941" containerName="glance-httpd" containerID="cri-o://a966cb1ff98f8190c957d01cf1bd095dd553069e7a18c2b3287c6bb6a471aaaf" gracePeriod=30 Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.759427 4708 generic.go:334] "Generic (PLEG): container finished" podID="0d2a3b0e-e329-43b5-937a-107b0bea8941" containerID="fe4ed75fcf01f154ccd411b7819fcfe28253b3198ad4de5b3b4d33a84578e814" exitCode=143 Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.759491 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d2a3b0e-e329-43b5-937a-107b0bea8941","Type":"ContainerDied","Data":"fe4ed75fcf01f154ccd411b7819fcfe28253b3198ad4de5b3b4d33a84578e814"} Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.759562 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.782233 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.791261 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.817872 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:22:48 crc kubenswrapper[4708]: E0320 16:22:48.818406 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" containerName="cinder-api-log" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.818426 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" containerName="cinder-api-log" Mar 20 16:22:48 crc kubenswrapper[4708]: E0320 16:22:48.818442 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" containerName="cinder-api" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.818451 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" containerName="cinder-api" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.818695 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" containerName="cinder-api-log" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.818727 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" containerName="cinder-api" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.819933 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.823991 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.824179 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.824247 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.853829 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.858817 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-config-data-custom\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.859125 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5c84402-2274-4aa9-a456-4b936ba6b94e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.859300 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-config-data\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.859453 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.859609 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c84402-2274-4aa9-a456-4b936ba6b94e-logs\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.859755 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.859867 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-scripts\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.859960 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.860093 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjx2q\" (UniqueName: \"kubernetes.io/projected/b5c84402-2274-4aa9-a456-4b936ba6b94e-kube-api-access-xjx2q\") pod 
\"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.961698 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-config-data\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.961765 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.961817 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c84402-2274-4aa9-a456-4b936ba6b94e-logs\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.961853 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.961913 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-scripts\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.962020 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.962169 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjx2q\" (UniqueName: \"kubernetes.io/projected/b5c84402-2274-4aa9-a456-4b936ba6b94e-kube-api-access-xjx2q\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.962378 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c84402-2274-4aa9-a456-4b936ba6b94e-logs\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.962871 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-config-data-custom\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.962935 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5c84402-2274-4aa9-a456-4b936ba6b94e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.963016 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b5c84402-2274-4aa9-a456-4b936ba6b94e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 
16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.967603 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-scripts\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.972656 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.972655 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.972689 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.973093 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-config-data\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:48 crc kubenswrapper[4708]: I0320 16:22:48.980246 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b5c84402-2274-4aa9-a456-4b936ba6b94e-config-data-custom\") pod \"cinder-api-0\" 
(UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.003143 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjx2q\" (UniqueName: \"kubernetes.io/projected/b5c84402-2274-4aa9-a456-4b936ba6b94e-kube-api-access-xjx2q\") pod \"cinder-api-0\" (UID: \"b5c84402-2274-4aa9-a456-4b936ba6b94e\") " pod="openstack/cinder-api-0" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.159198 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.420811 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-qc5ff"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.422717 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qc5ff" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.452527 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qc5ff"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.480090 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-operator-scripts\") pod \"nova-api-db-create-qc5ff\" (UID: \"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc\") " pod="openstack/nova-api-db-create-qc5ff" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.480282 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52qh2\" (UniqueName: \"kubernetes.io/projected/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-kube-api-access-52qh2\") pod \"nova-api-db-create-qc5ff\" (UID: \"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc\") " pod="openstack/nova-api-db-create-qc5ff" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.490100 4708 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-2vzqh"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.491595 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2vzqh" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.497498 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2vzqh"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.607752 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f246f8c-2e08-400c-af52-746be688f708-operator-scripts\") pod \"nova-cell0-db-create-2vzqh\" (UID: \"3f246f8c-2e08-400c-af52-746be688f708\") " pod="openstack/nova-cell0-db-create-2vzqh" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.607974 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52qh2\" (UniqueName: \"kubernetes.io/projected/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-kube-api-access-52qh2\") pod \"nova-api-db-create-qc5ff\" (UID: \"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc\") " pod="openstack/nova-api-db-create-qc5ff" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.613807 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-operator-scripts\") pod \"nova-api-db-create-qc5ff\" (UID: \"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc\") " pod="openstack/nova-api-db-create-qc5ff" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.614003 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb6jc\" (UniqueName: \"kubernetes.io/projected/3f246f8c-2e08-400c-af52-746be688f708-kube-api-access-gb6jc\") pod \"nova-cell0-db-create-2vzqh\" (UID: \"3f246f8c-2e08-400c-af52-746be688f708\") " 
pod="openstack/nova-cell0-db-create-2vzqh" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.616412 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-operator-scripts\") pod \"nova-api-db-create-qc5ff\" (UID: \"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc\") " pod="openstack/nova-api-db-create-qc5ff" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.618992 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0234-account-create-update-krdbf"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.622426 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0234-account-create-update-krdbf" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.624818 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.665793 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52qh2\" (UniqueName: \"kubernetes.io/projected/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-kube-api-access-52qh2\") pod \"nova-api-db-create-qc5ff\" (UID: \"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc\") " pod="openstack/nova-api-db-create-qc5ff" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.671163 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0234-account-create-update-krdbf"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.717320 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7hzv\" (UniqueName: \"kubernetes.io/projected/394a21a5-81ce-4b43-8642-70a03a4a0685-kube-api-access-s7hzv\") pod \"nova-api-0234-account-create-update-krdbf\" (UID: \"394a21a5-81ce-4b43-8642-70a03a4a0685\") " pod="openstack/nova-api-0234-account-create-update-krdbf" Mar 20 16:22:49 crc 
kubenswrapper[4708]: I0320 16:22:49.717473 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/394a21a5-81ce-4b43-8642-70a03a4a0685-operator-scripts\") pod \"nova-api-0234-account-create-update-krdbf\" (UID: \"394a21a5-81ce-4b43-8642-70a03a4a0685\") " pod="openstack/nova-api-0234-account-create-update-krdbf" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.717648 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb6jc\" (UniqueName: \"kubernetes.io/projected/3f246f8c-2e08-400c-af52-746be688f708-kube-api-access-gb6jc\") pod \"nova-cell0-db-create-2vzqh\" (UID: \"3f246f8c-2e08-400c-af52-746be688f708\") " pod="openstack/nova-cell0-db-create-2vzqh" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.717768 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f246f8c-2e08-400c-af52-746be688f708-operator-scripts\") pod \"nova-cell0-db-create-2vzqh\" (UID: \"3f246f8c-2e08-400c-af52-746be688f708\") " pod="openstack/nova-cell0-db-create-2vzqh" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.718644 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f246f8c-2e08-400c-af52-746be688f708-operator-scripts\") pod \"nova-cell0-db-create-2vzqh\" (UID: \"3f246f8c-2e08-400c-af52-746be688f708\") " pod="openstack/nova-cell0-db-create-2vzqh" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.734551 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wwl4c"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.736189 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wwl4c" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.743390 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb6jc\" (UniqueName: \"kubernetes.io/projected/3f246f8c-2e08-400c-af52-746be688f708-kube-api-access-gb6jc\") pod \"nova-cell0-db-create-2vzqh\" (UID: \"3f246f8c-2e08-400c-af52-746be688f708\") " pod="openstack/nova-cell0-db-create-2vzqh" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.757638 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wwl4c"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.777481 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20739bf7-b966-4dbd-8846-4bda838c5da4","Type":"ContainerStarted","Data":"815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524"} Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.777661 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="ceilometer-central-agent" containerID="cri-o://b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92" gracePeriod=30 Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.777957 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.778227 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="proxy-httpd" containerID="cri-o://815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524" gracePeriod=30 Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.778273 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" 
containerName="sg-core" containerID="cri-o://f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a" gracePeriod=30 Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.778307 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="ceilometer-notification-agent" containerID="cri-o://b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5" gracePeriod=30 Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.780554 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.785780 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b5c84402-2274-4aa9-a456-4b936ba6b94e","Type":"ContainerStarted","Data":"a4d95c6f0a531411fe7b96a16d07f634341acdb09dfc2719d7ae5dd9a37ca6bb"} Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.788998 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qc5ff" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.821079 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/394a21a5-81ce-4b43-8642-70a03a4a0685-operator-scripts\") pod \"nova-api-0234-account-create-update-krdbf\" (UID: \"394a21a5-81ce-4b43-8642-70a03a4a0685\") " pod="openstack/nova-api-0234-account-create-update-krdbf" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.824568 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630fc775-bda7-45ac-9852-650855479072-operator-scripts\") pod \"nova-cell1-db-create-wwl4c\" (UID: \"630fc775-bda7-45ac-9852-650855479072\") " pod="openstack/nova-cell1-db-create-wwl4c" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.825319 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/394a21a5-81ce-4b43-8642-70a03a4a0685-operator-scripts\") pod \"nova-api-0234-account-create-update-krdbf\" (UID: \"394a21a5-81ce-4b43-8642-70a03a4a0685\") " pod="openstack/nova-api-0234-account-create-update-krdbf" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.825579 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqdkm\" (UniqueName: \"kubernetes.io/projected/630fc775-bda7-45ac-9852-650855479072-kube-api-access-nqdkm\") pod \"nova-cell1-db-create-wwl4c\" (UID: \"630fc775-bda7-45ac-9852-650855479072\") " pod="openstack/nova-cell1-db-create-wwl4c" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.825871 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7hzv\" (UniqueName: \"kubernetes.io/projected/394a21a5-81ce-4b43-8642-70a03a4a0685-kube-api-access-s7hzv\") pod 
\"nova-api-0234-account-create-update-krdbf\" (UID: \"394a21a5-81ce-4b43-8642-70a03a4a0685\") " pod="openstack/nova-api-0234-account-create-update-krdbf" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.830573 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-f9fb-account-create-update-ngw6v"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.832388 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.836339 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2vzqh" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.839426 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.851628 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f9fb-account-create-update-ngw6v"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.854547 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7hzv\" (UniqueName: \"kubernetes.io/projected/394a21a5-81ce-4b43-8642-70a03a4a0685-kube-api-access-s7hzv\") pod \"nova-api-0234-account-create-update-krdbf\" (UID: \"394a21a5-81ce-4b43-8642-70a03a4a0685\") " pod="openstack/nova-api-0234-account-create-update-krdbf" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.862339 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.166928847 podStartE2EDuration="11.862316571s" podCreationTimestamp="2026-03-20 16:22:38 +0000 UTC" firstStartedPulling="2026-03-20 16:22:39.56395492 +0000 UTC m=+1314.238291635" lastFinishedPulling="2026-03-20 16:22:49.259342644 +0000 UTC m=+1323.933679359" observedRunningTime="2026-03-20 16:22:49.804993072 +0000 UTC 
m=+1324.479329787" watchObservedRunningTime="2026-03-20 16:22:49.862316571 +0000 UTC m=+1324.536653286" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.927495 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqdkm\" (UniqueName: \"kubernetes.io/projected/630fc775-bda7-45ac-9852-650855479072-kube-api-access-nqdkm\") pod \"nova-cell1-db-create-wwl4c\" (UID: \"630fc775-bda7-45ac-9852-650855479072\") " pod="openstack/nova-cell1-db-create-wwl4c" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.927630 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm5rg\" (UniqueName: \"kubernetes.io/projected/3116dd2a-d2d0-46cf-837d-56d29a7e116f-kube-api-access-mm5rg\") pod \"nova-cell0-f9fb-account-create-update-ngw6v\" (UID: \"3116dd2a-d2d0-46cf-837d-56d29a7e116f\") " pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.927762 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3116dd2a-d2d0-46cf-837d-56d29a7e116f-operator-scripts\") pod \"nova-cell0-f9fb-account-create-update-ngw6v\" (UID: \"3116dd2a-d2d0-46cf-837d-56d29a7e116f\") " pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.927800 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630fc775-bda7-45ac-9852-650855479072-operator-scripts\") pod \"nova-cell1-db-create-wwl4c\" (UID: \"630fc775-bda7-45ac-9852-650855479072\") " pod="openstack/nova-cell1-db-create-wwl4c" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.928686 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/630fc775-bda7-45ac-9852-650855479072-operator-scripts\") pod \"nova-cell1-db-create-wwl4c\" (UID: \"630fc775-bda7-45ac-9852-650855479072\") " pod="openstack/nova-cell1-db-create-wwl4c" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.957591 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqdkm\" (UniqueName: \"kubernetes.io/projected/630fc775-bda7-45ac-9852-650855479072-kube-api-access-nqdkm\") pod \"nova-cell1-db-create-wwl4c\" (UID: \"630fc775-bda7-45ac-9852-650855479072\") " pod="openstack/nova-cell1-db-create-wwl4c" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.990014 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1d98-account-create-update-vvbxv"] Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.992193 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" Mar 20 16:22:49 crc kubenswrapper[4708]: I0320 16:22:49.995634 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:49.999285 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1d98-account-create-update-vvbxv"] Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.001805 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0234-account-create-update-krdbf" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.032655 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3116dd2a-d2d0-46cf-837d-56d29a7e116f-operator-scripts\") pod \"nova-cell0-f9fb-account-create-update-ngw6v\" (UID: \"3116dd2a-d2d0-46cf-837d-56d29a7e116f\") " pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.032820 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm5rg\" (UniqueName: \"kubernetes.io/projected/3116dd2a-d2d0-46cf-837d-56d29a7e116f-kube-api-access-mm5rg\") pod \"nova-cell0-f9fb-account-create-update-ngw6v\" (UID: \"3116dd2a-d2d0-46cf-837d-56d29a7e116f\") " pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.033828 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3116dd2a-d2d0-46cf-837d-56d29a7e116f-operator-scripts\") pod \"nova-cell0-f9fb-account-create-update-ngw6v\" (UID: \"3116dd2a-d2d0-46cf-837d-56d29a7e116f\") " pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.065184 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm5rg\" (UniqueName: \"kubernetes.io/projected/3116dd2a-d2d0-46cf-837d-56d29a7e116f-kube-api-access-mm5rg\") pod \"nova-cell0-f9fb-account-create-update-ngw6v\" (UID: \"3116dd2a-d2d0-46cf-837d-56d29a7e116f\") " pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.135056 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlnvd\" (UniqueName: 
\"kubernetes.io/projected/b0b66f79-c7ee-40c3-a026-0c42a0648f11-kube-api-access-wlnvd\") pod \"nova-cell1-1d98-account-create-update-vvbxv\" (UID: \"b0b66f79-c7ee-40c3-a026-0c42a0648f11\") " pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.135532 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0b66f79-c7ee-40c3-a026-0c42a0648f11-operator-scripts\") pod \"nova-cell1-1d98-account-create-update-vvbxv\" (UID: \"b0b66f79-c7ee-40c3-a026-0c42a0648f11\") " pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.164157 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00996e6f-a64c-4f6b-85ba-3c1ed4284b8c" path="/var/lib/kubelet/pods/00996e6f-a64c-4f6b-85ba-3c1ed4284b8c/volumes" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.185470 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwl4c" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.195192 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.239228 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlnvd\" (UniqueName: \"kubernetes.io/projected/b0b66f79-c7ee-40c3-a026-0c42a0648f11-kube-api-access-wlnvd\") pod \"nova-cell1-1d98-account-create-update-vvbxv\" (UID: \"b0b66f79-c7ee-40c3-a026-0c42a0648f11\") " pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.239755 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0b66f79-c7ee-40c3-a026-0c42a0648f11-operator-scripts\") pod \"nova-cell1-1d98-account-create-update-vvbxv\" (UID: \"b0b66f79-c7ee-40c3-a026-0c42a0648f11\") " pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.240714 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0b66f79-c7ee-40c3-a026-0c42a0648f11-operator-scripts\") pod \"nova-cell1-1d98-account-create-update-vvbxv\" (UID: \"b0b66f79-c7ee-40c3-a026-0c42a0648f11\") " pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.275249 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlnvd\" (UniqueName: \"kubernetes.io/projected/b0b66f79-c7ee-40c3-a026-0c42a0648f11-kube-api-access-wlnvd\") pod \"nova-cell1-1d98-account-create-update-vvbxv\" (UID: \"b0b66f79-c7ee-40c3-a026-0c42a0648f11\") " pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.329754 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.775277 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qc5ff"] Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.815330 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2vzqh"] Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.978933 4708 generic.go:334] "Generic (PLEG): container finished" podID="ac8c72be-4723-43be-8537-072adeb1e924" containerID="e2c3b178ed0e541d253f458a7688daacddfaec3e1b2fcd36a923a2e80a254119" exitCode=0 Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.979494 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac8c72be-4723-43be-8537-072adeb1e924","Type":"ContainerDied","Data":"e2c3b178ed0e541d253f458a7688daacddfaec3e1b2fcd36a923a2e80a254119"} Mar 20 16:22:50 crc kubenswrapper[4708]: I0320 16:22:50.985331 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0234-account-create-update-krdbf"] Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.061548 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wwl4c"] Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.061833 4708 generic.go:334] "Generic (PLEG): container finished" podID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerID="815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524" exitCode=0 Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.061857 4708 generic.go:334] "Generic (PLEG): container finished" podID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerID="f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a" exitCode=2 Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.061864 4708 generic.go:334] "Generic (PLEG): container finished" podID="20739bf7-b966-4dbd-8846-4bda838c5da4" 
containerID="b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5" exitCode=0 Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.061879 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20739bf7-b966-4dbd-8846-4bda838c5da4","Type":"ContainerDied","Data":"815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524"} Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.061897 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20739bf7-b966-4dbd-8846-4bda838c5da4","Type":"ContainerDied","Data":"f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a"} Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.061909 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20739bf7-b966-4dbd-8846-4bda838c5da4","Type":"ContainerDied","Data":"b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5"} Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.093226 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.245027 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-logs\") pod \"ac8c72be-4723-43be-8537-072adeb1e924\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.245084 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-public-tls-certs\") pod \"ac8c72be-4723-43be-8537-072adeb1e924\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.245106 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxfst\" (UniqueName: \"kubernetes.io/projected/ac8c72be-4723-43be-8537-072adeb1e924-kube-api-access-xxfst\") pod \"ac8c72be-4723-43be-8537-072adeb1e924\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.245123 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ac8c72be-4723-43be-8537-072adeb1e924\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.245142 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-config-data\") pod \"ac8c72be-4723-43be-8537-072adeb1e924\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.245171 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-combined-ca-bundle\") pod \"ac8c72be-4723-43be-8537-072adeb1e924\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.245227 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-scripts\") pod \"ac8c72be-4723-43be-8537-072adeb1e924\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.245284 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-httpd-run\") pod \"ac8c72be-4723-43be-8537-072adeb1e924\" (UID: \"ac8c72be-4723-43be-8537-072adeb1e924\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.245610 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-logs" (OuterVolumeSpecName: "logs") pod "ac8c72be-4723-43be-8537-072adeb1e924" (UID: "ac8c72be-4723-43be-8537-072adeb1e924"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.245832 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ac8c72be-4723-43be-8537-072adeb1e924" (UID: "ac8c72be-4723-43be-8537-072adeb1e924"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.251024 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "ac8c72be-4723-43be-8537-072adeb1e924" (UID: "ac8c72be-4723-43be-8537-072adeb1e924"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.255325 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-scripts" (OuterVolumeSpecName: "scripts") pod "ac8c72be-4723-43be-8537-072adeb1e924" (UID: "ac8c72be-4723-43be-8537-072adeb1e924"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.255476 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8c72be-4723-43be-8537-072adeb1e924-kube-api-access-xxfst" (OuterVolumeSpecName: "kube-api-access-xxfst") pod "ac8c72be-4723-43be-8537-072adeb1e924" (UID: "ac8c72be-4723-43be-8537-072adeb1e924"). InnerVolumeSpecName "kube-api-access-xxfst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.306540 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac8c72be-4723-43be-8537-072adeb1e924" (UID: "ac8c72be-4723-43be-8537-072adeb1e924"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.347569 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.347619 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxfst\" (UniqueName: \"kubernetes.io/projected/ac8c72be-4723-43be-8537-072adeb1e924-kube-api-access-xxfst\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.347650 4708 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.347665 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.347711 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.347731 4708 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac8c72be-4723-43be-8537-072adeb1e924-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.408728 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ac8c72be-4723-43be-8537-072adeb1e924" (UID: "ac8c72be-4723-43be-8537-072adeb1e924"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.414330 4708 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.449486 4708 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.449519 4708 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.464334 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-config-data" (OuterVolumeSpecName: "config-data") pod "ac8c72be-4723-43be-8537-072adeb1e924" (UID: "ac8c72be-4723-43be-8537-072adeb1e924"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.483931 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f9fb-account-create-update-ngw6v"] Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.565058 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1d98-account-create-update-vvbxv"] Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.566799 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8c72be-4723-43be-8537-072adeb1e924-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.738314 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.774719 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-run-httpd\") pod \"20739bf7-b966-4dbd-8846-4bda838c5da4\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.774893 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-scripts\") pod \"20739bf7-b966-4dbd-8846-4bda838c5da4\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.774983 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qczlv\" (UniqueName: \"kubernetes.io/projected/20739bf7-b966-4dbd-8846-4bda838c5da4-kube-api-access-qczlv\") pod \"20739bf7-b966-4dbd-8846-4bda838c5da4\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.775101 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-sg-core-conf-yaml\") pod \"20739bf7-b966-4dbd-8846-4bda838c5da4\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.775233 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-combined-ca-bundle\") pod \"20739bf7-b966-4dbd-8846-4bda838c5da4\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.775296 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-log-httpd\") pod \"20739bf7-b966-4dbd-8846-4bda838c5da4\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.775326 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-config-data\") pod \"20739bf7-b966-4dbd-8846-4bda838c5da4\" (UID: \"20739bf7-b966-4dbd-8846-4bda838c5da4\") " Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.776512 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "20739bf7-b966-4dbd-8846-4bda838c5da4" (UID: "20739bf7-b966-4dbd-8846-4bda838c5da4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.778461 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "20739bf7-b966-4dbd-8846-4bda838c5da4" (UID: "20739bf7-b966-4dbd-8846-4bda838c5da4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.798153 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20739bf7-b966-4dbd-8846-4bda838c5da4-kube-api-access-qczlv" (OuterVolumeSpecName: "kube-api-access-qczlv") pod "20739bf7-b966-4dbd-8846-4bda838c5da4" (UID: "20739bf7-b966-4dbd-8846-4bda838c5da4"). InnerVolumeSpecName "kube-api-access-qczlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.798258 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-scripts" (OuterVolumeSpecName: "scripts") pod "20739bf7-b966-4dbd-8846-4bda838c5da4" (UID: "20739bf7-b966-4dbd-8846-4bda838c5da4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.878317 4708 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.878698 4708 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20739bf7-b966-4dbd-8846-4bda838c5da4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.878712 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:51 crc kubenswrapper[4708]: I0320 16:22:51.878720 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qczlv\" (UniqueName: \"kubernetes.io/projected/20739bf7-b966-4dbd-8846-4bda838c5da4-kube-api-access-qczlv\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.076950 4708 generic.go:334] "Generic (PLEG): container finished" podID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerID="b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92" exitCode=0 Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.077012 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.077039 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20739bf7-b966-4dbd-8846-4bda838c5da4","Type":"ContainerDied","Data":"b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.077075 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20739bf7-b966-4dbd-8846-4bda838c5da4","Type":"ContainerDied","Data":"11e20719508eb00580072cc1797cc1f6e9e891987d981e6753addb9045972c97"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.077097 4708 scope.go:117] "RemoveContainer" containerID="815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.079724 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" event={"ID":"3116dd2a-d2d0-46cf-837d-56d29a7e116f","Type":"ContainerStarted","Data":"8c742ec44123d2e745811d0062cb7e67b1579a19217005f1762121882e9c950e"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.089166 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac8c72be-4723-43be-8537-072adeb1e924","Type":"ContainerDied","Data":"075415dd382ea42123ae040feaa93419172162920624edf385aa9e09628d8118"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.091505 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b5c84402-2274-4aa9-a456-4b936ba6b94e","Type":"ContainerStarted","Data":"42fa6df80535fa82552830e78cac093f56112f921954b4fe325b4edcf2cc89f0"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.094009 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.098152 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0234-account-create-update-krdbf" event={"ID":"394a21a5-81ce-4b43-8642-70a03a4a0685","Type":"ContainerStarted","Data":"679b487bf3840839d41bd9c30760c28a8920609d2567fe352aea2f7b8faf3e1b"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.100386 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" event={"ID":"b0b66f79-c7ee-40c3-a026-0c42a0648f11","Type":"ContainerStarted","Data":"6e30842eb23ce669217afe90b8471485e9ff3f11e5eda2fc364c8b16bb1d9def"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.109446 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2vzqh" event={"ID":"3f246f8c-2e08-400c-af52-746be688f708","Type":"ContainerStarted","Data":"4fdeea389bf333390f5cf82b8084e149563364b6d640b9454792c9f0e2458f82"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.109495 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2vzqh" event={"ID":"3f246f8c-2e08-400c-af52-746be688f708","Type":"ContainerStarted","Data":"d0fb23daa0217fe61d433c5848b140ca0dc3937a13f6a7236a57bbca846e28f1"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.127693 4708 scope.go:117] "RemoveContainer" containerID="f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.130618 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" podStartSLOduration=3.130600789 podStartE2EDuration="3.130600789s" podCreationTimestamp="2026-03-20 16:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
16:22:52.104378901 +0000 UTC m=+1326.778715616" watchObservedRunningTime="2026-03-20 16:22:52.130600789 +0000 UTC m=+1326.804937504" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.152836 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-2vzqh" podStartSLOduration=3.152817807 podStartE2EDuration="3.152817807s" podCreationTimestamp="2026-03-20 16:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:52.145368762 +0000 UTC m=+1326.819705497" watchObservedRunningTime="2026-03-20 16:22:52.152817807 +0000 UTC m=+1326.827154522" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.157076 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwl4c" event={"ID":"630fc775-bda7-45ac-9852-650855479072","Type":"ContainerStarted","Data":"a9e1f8effa424893dcfd6f68b1d52bfcda3c6d15bfe308378dbc4526a749d5c2"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.157116 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwl4c" event={"ID":"630fc775-bda7-45ac-9852-650855479072","Type":"ContainerStarted","Data":"1191a7dbcaab59442e6a73e474c443a08db615bbe119ce8fe4a62c0ee4e1c8b0"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.159658 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "20739bf7-b966-4dbd-8846-4bda838c5da4" (UID: "20739bf7-b966-4dbd-8846-4bda838c5da4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.165381 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qc5ff" event={"ID":"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc","Type":"ContainerStarted","Data":"687b9c422e841e69520bc01f976381a4e90d6d634600469ca2f0b8c4d8dfdcf9"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.165425 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qc5ff" event={"ID":"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc","Type":"ContainerStarted","Data":"5cda86dd894e99a399e22155ff03b2fa23ce5df91932ea94d83976ad106bc3a1"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.170955 4708 generic.go:334] "Generic (PLEG): container finished" podID="0d2a3b0e-e329-43b5-937a-107b0bea8941" containerID="a966cb1ff98f8190c957d01cf1bd095dd553069e7a18c2b3287c6bb6a471aaaf" exitCode=0 Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.171019 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d2a3b0e-e329-43b5-937a-107b0bea8941","Type":"ContainerDied","Data":"a966cb1ff98f8190c957d01cf1bd095dd553069e7a18c2b3287c6bb6a471aaaf"} Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.179223 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-wwl4c" podStartSLOduration=3.179203899 podStartE2EDuration="3.179203899s" podCreationTimestamp="2026-03-20 16:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:52.164548618 +0000 UTC m=+1326.838885333" watchObservedRunningTime="2026-03-20 16:22:52.179203899 +0000 UTC m=+1326.853540614" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.189239 4708 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.216981 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.231185 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.242493 4708 scope.go:117] "RemoveContainer" containerID="b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.242624 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.243085 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8c72be-4723-43be-8537-072adeb1e924" containerName="glance-httpd" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243098 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8c72be-4723-43be-8537-072adeb1e924" containerName="glance-httpd" Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.243108 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="ceilometer-central-agent" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243114 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="ceilometer-central-agent" Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.243123 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="ceilometer-notification-agent" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243129 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" 
containerName="ceilometer-notification-agent" Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.243147 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="proxy-httpd" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243153 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="proxy-httpd" Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.243166 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8c72be-4723-43be-8537-072adeb1e924" containerName="glance-log" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243173 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8c72be-4723-43be-8537-072adeb1e924" containerName="glance-log" Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.243204 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="sg-core" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243210 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="sg-core" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243384 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="ceilometer-central-agent" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243396 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="ceilometer-notification-agent" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243404 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="proxy-httpd" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243417 4708 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" containerName="sg-core" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243426 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8c72be-4723-43be-8537-072adeb1e924" containerName="glance-httpd" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.243441 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8c72be-4723-43be-8537-072adeb1e924" containerName="glance-log" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.244402 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.248944 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.253294 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-qc5ff" podStartSLOduration=3.253273656 podStartE2EDuration="3.253273656s" podCreationTimestamp="2026-03-20 16:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:52.216955822 +0000 UTC m=+1326.891292537" watchObservedRunningTime="2026-03-20 16:22:52.253273656 +0000 UTC m=+1326.927610381" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.255445 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.286626 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.297716 4708 scope.go:117] "RemoveContainer" containerID="b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.326776 4708 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20739bf7-b966-4dbd-8846-4bda838c5da4" (UID: "20739bf7-b966-4dbd-8846-4bda838c5da4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.395112 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt59b\" (UniqueName: \"kubernetes.io/projected/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-kube-api-access-jt59b\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.395467 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-config-data\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.395515 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-scripts\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.395571 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " 
pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.395605 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.395622 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.396100 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.396184 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-logs\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.396256 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.415838 4708 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-config-data" (OuterVolumeSpecName: "config-data") pod "20739bf7-b966-4dbd-8846-4bda838c5da4" (UID: "20739bf7-b966-4dbd-8846-4bda838c5da4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.499103 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.499165 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.499189 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.499227 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.499272 4708 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-logs\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.499318 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt59b\" (UniqueName: \"kubernetes.io/projected/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-kube-api-access-jt59b\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.499368 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-config-data\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.499428 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-scripts\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.499488 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20739bf7-b966-4dbd-8846-4bda838c5da4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.502005 4708 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: 
\"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.502366 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-logs\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.502454 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.508105 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.509410 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-config-data\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.510138 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " 
pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.510437 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-scripts\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.530530 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt59b\" (UniqueName: \"kubernetes.io/projected/2842ffef-b14c-48e2-8cf5-cb9aee2d1131-kube-api-access-jt59b\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.559821 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"2842ffef-b14c-48e2-8cf5-cb9aee2d1131\") " pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.586845 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.767765 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6b6ff5cbbd-kjfxp" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.837008 4708 scope.go:117] "RemoveContainer" containerID="815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524" Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.838304 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524\": container with ID starting with 815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524 not found: ID does not exist" containerID="815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.838378 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524"} err="failed to get container status \"815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524\": rpc error: code = NotFound desc = could not find container \"815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524\": container with ID starting with 815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524 not found: ID does not exist" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.838407 4708 scope.go:117] "RemoveContainer" containerID="f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.838803 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.838943 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a\": container with ID starting with f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a not found: ID does not exist" containerID="f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.838978 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a"} err="failed to get container status \"f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a\": rpc error: code = NotFound desc = could not find container \"f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a\": container with ID starting with f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a not found: ID does not exist" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.838999 4708 scope.go:117] "RemoveContainer" containerID="b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5" Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.839282 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5\": container with ID starting with b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5 not found: ID does not exist" containerID="b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.839301 4708 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5"} err="failed to get container status \"b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5\": rpc error: code = NotFound desc = could not find container \"b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5\": container with ID starting with b92488310cf2d13ea387a3d1495c1e24ba90ac2317d2667b28547954299b05f5 not found: ID does not exist" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.839314 4708 scope.go:117] "RemoveContainer" containerID="b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92" Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.853249 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92\": container with ID starting with b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92 not found: ID does not exist" containerID="b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.853302 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92"} err="failed to get container status \"b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92\": rpc error: code = NotFound desc = could not find container \"b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92\": container with ID starting with b235b3433d8e3e92f5c184e2715e692cd225aab74b2665d9b2d20da7e6527a92 not found: ID does not exist" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.853335 4708 scope.go:117] "RemoveContainer" containerID="e2c3b178ed0e541d253f458a7688daacddfaec3e1b2fcd36a923a2e80a254119" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.868610 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.884628 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.910690 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.913131 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2a3b0e-e329-43b5-937a-107b0bea8941" containerName="glance-httpd" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.913158 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2a3b0e-e329-43b5-937a-107b0bea8941" containerName="glance-httpd" Mar 20 16:22:52 crc kubenswrapper[4708]: E0320 16:22:52.915001 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2a3b0e-e329-43b5-937a-107b0bea8941" containerName="glance-log" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.915040 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2a3b0e-e329-43b5-937a-107b0bea8941" containerName="glance-log" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.915271 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2a3b0e-e329-43b5-937a-107b0bea8941" containerName="glance-log" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.915285 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2a3b0e-e329-43b5-937a-107b0bea8941" containerName="glance-httpd" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.917027 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-config-data\") pod \"0d2a3b0e-e329-43b5-937a-107b0bea8941\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.917111 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-internal-tls-certs\") pod \"0d2a3b0e-e329-43b5-937a-107b0bea8941\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.917208 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"0d2a3b0e-e329-43b5-937a-107b0bea8941\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.917230 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qhfv\" (UniqueName: \"kubernetes.io/projected/0d2a3b0e-e329-43b5-937a-107b0bea8941-kube-api-access-7qhfv\") pod \"0d2a3b0e-e329-43b5-937a-107b0bea8941\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.917299 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-combined-ca-bundle\") pod \"0d2a3b0e-e329-43b5-937a-107b0bea8941\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.917331 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-httpd-run\") pod \"0d2a3b0e-e329-43b5-937a-107b0bea8941\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.917376 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-logs\") pod \"0d2a3b0e-e329-43b5-937a-107b0bea8941\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.917474 
4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-scripts\") pod \"0d2a3b0e-e329-43b5-937a-107b0bea8941\" (UID: \"0d2a3b0e-e329-43b5-937a-107b0bea8941\") " Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.917938 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.923650 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.923701 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0d2a3b0e-e329-43b5-937a-107b0bea8941" (UID: "0d2a3b0e-e329-43b5-937a-107b0bea8941"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.923960 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.925870 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-logs" (OuterVolumeSpecName: "logs") pod "0d2a3b0e-e329-43b5-937a-107b0bea8941" (UID: "0d2a3b0e-e329-43b5-937a-107b0bea8941"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.926511 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-scripts" (OuterVolumeSpecName: "scripts") pod "0d2a3b0e-e329-43b5-937a-107b0bea8941" (UID: "0d2a3b0e-e329-43b5-937a-107b0bea8941"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.932201 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "0d2a3b0e-e329-43b5-937a-107b0bea8941" (UID: "0d2a3b0e-e329-43b5-937a-107b0bea8941"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.937437 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d2a3b0e-e329-43b5-937a-107b0bea8941-kube-api-access-7qhfv" (OuterVolumeSpecName: "kube-api-access-7qhfv") pod "0d2a3b0e-e329-43b5-937a-107b0bea8941" (UID: "0d2a3b0e-e329-43b5-937a-107b0bea8941"). InnerVolumeSpecName "kube-api-access-7qhfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.941898 4708 scope.go:117] "RemoveContainer" containerID="81ccf63c44b61beb53e56f9b9bbb79ac7b584bc9050d2b8ea6e95ac0d0995df8" Mar 20 16:22:52 crc kubenswrapper[4708]: I0320 16:22:52.952431 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.020732 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.020790 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-scripts\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 
20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.020826 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-log-httpd\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.020866 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xjg6\" (UniqueName: \"kubernetes.io/projected/2747a4ec-c212-4f8e-9266-ffb68a041ef6-kube-api-access-5xjg6\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.020916 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.021003 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-config-data\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.021067 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-run-httpd\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.021091 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d2a3b0e-e329-43b5-937a-107b0bea8941" (UID: "0d2a3b0e-e329-43b5-937a-107b0bea8941"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.021224 4708 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.021247 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qhfv\" (UniqueName: \"kubernetes.io/projected/0d2a3b0e-e329-43b5-937a-107b0bea8941-kube-api-access-7qhfv\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.021262 4708 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.021275 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d2a3b0e-e329-43b5-937a-107b0bea8941-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.021286 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.022608 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-config-data" (OuterVolumeSpecName: "config-data") pod "0d2a3b0e-e329-43b5-937a-107b0bea8941" (UID: "0d2a3b0e-e329-43b5-937a-107b0bea8941"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.026817 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0d2a3b0e-e329-43b5-937a-107b0bea8941" (UID: "0d2a3b0e-e329-43b5-937a-107b0bea8941"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.061445 4708 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.123081 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-config-data\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.123516 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-run-httpd\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.123591 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.123637 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-scripts\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.123734 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-log-httpd\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.123795 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xjg6\" (UniqueName: \"kubernetes.io/projected/2747a4ec-c212-4f8e-9266-ffb68a041ef6-kube-api-access-5xjg6\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.123867 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.123933 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.123947 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.123956 4708 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0d2a3b0e-e329-43b5-937a-107b0bea8941-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.123965 4708 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.128501 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-log-httpd\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.128919 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-run-httpd\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.129303 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-config-data\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.129519 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-scripts\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.130527 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.131420 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.153598 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xjg6\" (UniqueName: \"kubernetes.io/projected/2747a4ec-c212-4f8e-9266-ffb68a041ef6-kube-api-access-5xjg6\") pod \"ceilometer-0\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.227582 4708 generic.go:334] "Generic (PLEG): container finished" podID="b0b66f79-c7ee-40c3-a026-0c42a0648f11" containerID="ef1d3bb93632a5560ad920acab64b1d767f12e941660c4b340e61063ebd5a674" exitCode=0 Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.228033 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" event={"ID":"b0b66f79-c7ee-40c3-a026-0c42a0648f11","Type":"ContainerDied","Data":"ef1d3bb93632a5560ad920acab64b1d767f12e941660c4b340e61063ebd5a674"} Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.237630 4708 generic.go:334] "Generic (PLEG): container finished" podID="630fc775-bda7-45ac-9852-650855479072" containerID="a9e1f8effa424893dcfd6f68b1d52bfcda3c6d15bfe308378dbc4526a749d5c2" exitCode=0 Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.237726 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwl4c" event={"ID":"630fc775-bda7-45ac-9852-650855479072","Type":"ContainerDied","Data":"a9e1f8effa424893dcfd6f68b1d52bfcda3c6d15bfe308378dbc4526a749d5c2"} Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.250010 
4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.251248 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d2a3b0e-e329-43b5-937a-107b0bea8941","Type":"ContainerDied","Data":"7c30406650e08d2764e1990ec20d4d84d89333a4fa6b608b8064ab4c0d12493b"} Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.251338 4708 scope.go:117] "RemoveContainer" containerID="a966cb1ff98f8190c957d01cf1bd095dd553069e7a18c2b3287c6bb6a471aaaf" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.269934 4708 generic.go:334] "Generic (PLEG): container finished" podID="394a21a5-81ce-4b43-8642-70a03a4a0685" containerID="01fc1382778da1917738bf00c90b8ce2d1e7ef86718a5766543790d531d34d0a" exitCode=0 Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.270056 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0234-account-create-update-krdbf" event={"ID":"394a21a5-81ce-4b43-8642-70a03a4a0685","Type":"ContainerDied","Data":"01fc1382778da1917738bf00c90b8ce2d1e7ef86718a5766543790d531d34d0a"} Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.273433 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.283755 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.296039 4708 generic.go:334] "Generic (PLEG): container finished" podID="3116dd2a-d2d0-46cf-837d-56d29a7e116f" containerID="16e1a51a2faac0c5273d327fd38ea9b0770a0d58a7a65ec9df7fb3a1ef1f562a" exitCode=0 Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.296174 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" event={"ID":"3116dd2a-d2d0-46cf-837d-56d29a7e116f","Type":"ContainerDied","Data":"16e1a51a2faac0c5273d327fd38ea9b0770a0d58a7a65ec9df7fb3a1ef1f562a"} Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.318774 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.319086 4708 scope.go:117] "RemoveContainer" containerID="fe4ed75fcf01f154ccd411b7819fcfe28253b3198ad4de5b3b4d33a84578e814" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.330119 4708 generic.go:334] "Generic (PLEG): container finished" podID="3f246f8c-2e08-400c-af52-746be688f708" containerID="4fdeea389bf333390f5cf82b8084e149563364b6d640b9454792c9f0e2458f82" exitCode=0 Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.330440 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2vzqh" event={"ID":"3f246f8c-2e08-400c-af52-746be688f708","Type":"ContainerDied","Data":"4fdeea389bf333390f5cf82b8084e149563364b6d640b9454792c9f0e2458f82"} Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.346004 4708 generic.go:334] "Generic (PLEG): container finished" podID="22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc" containerID="687b9c422e841e69520bc01f976381a4e90d6d634600469ca2f0b8c4d8dfdcf9" exitCode=0 Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 
16:22:53.346115 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qc5ff" event={"ID":"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc","Type":"ContainerDied","Data":"687b9c422e841e69520bc01f976381a4e90d6d634600469ca2f0b8c4d8dfdcf9"} Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.387710 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.396587 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b5c84402-2274-4aa9-a456-4b936ba6b94e","Type":"ContainerStarted","Data":"f9af8b67c8bcff069cc1394fb5a9f2734d62d645d50dd83d386494f46264aaf6"} Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.397754 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.401615 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.430610 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.432753 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.438365 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.438925 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.440661 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.505889 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.505867008 podStartE2EDuration="5.505867008s" podCreationTimestamp="2026-03-20 16:22:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:53.473057789 +0000 UTC m=+1328.147394514" watchObservedRunningTime="2026-03-20 16:22:53.505867008 +0000 UTC m=+1328.180203723" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.542251 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.542330 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c01a244f-263d-435a-90d8-35bdb111bb6a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.542363 
4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.542381 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c01a244f-263d-435a-90d8-35bdb111bb6a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.542415 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc2md\" (UniqueName: \"kubernetes.io/projected/c01a244f-263d-435a-90d8-35bdb111bb6a-kube-api-access-sc2md\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.542448 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.542527 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 
crc kubenswrapper[4708]: I0320 16:22:53.542564 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.644951 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.645037 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c01a244f-263d-435a-90d8-35bdb111bb6a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.645081 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.645098 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c01a244f-263d-435a-90d8-35bdb111bb6a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.645523 4708 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.645561 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c01a244f-263d-435a-90d8-35bdb111bb6a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.645867 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c01a244f-263d-435a-90d8-35bdb111bb6a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.647417 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc2md\" (UniqueName: \"kubernetes.io/projected/c01a244f-263d-435a-90d8-35bdb111bb6a-kube-api-access-sc2md\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.647472 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.647547 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.647580 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.653287 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.660382 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.660986 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.661048 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01a244f-263d-435a-90d8-35bdb111bb6a-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.668953 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc2md\" (UniqueName: \"kubernetes.io/projected/c01a244f-263d-435a-90d8-35bdb111bb6a-kube-api-access-sc2md\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.678944 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c01a244f-263d-435a-90d8-35bdb111bb6a\") " pod="openstack/glance-default-internal-api-0" Mar 20 16:22:53 crc kubenswrapper[4708]: I0320 16:22:53.836904 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.026922 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.135723 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d2a3b0e-e329-43b5-937a-107b0bea8941" path="/var/lib/kubelet/pods/0d2a3b0e-e329-43b5-937a-107b0bea8941/volumes" Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.136910 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20739bf7-b966-4dbd-8846-4bda838c5da4" path="/var/lib/kubelet/pods/20739bf7-b966-4dbd-8846-4bda838c5da4/volumes" Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.138202 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8c72be-4723-43be-8537-072adeb1e924" path="/var/lib/kubelet/pods/ac8c72be-4723-43be-8537-072adeb1e924/volumes" Mar 20 16:22:54 crc 
kubenswrapper[4708]: I0320 16:22:54.431072 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2747a4ec-c212-4f8e-9266-ffb68a041ef6","Type":"ContainerStarted","Data":"54df14489dd4af3340f511ef38a58e4d186cda0ab130925bdc277f38781652b5"} Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.435010 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.435378 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2842ffef-b14c-48e2-8cf5-cb9aee2d1131","Type":"ContainerStarted","Data":"fcb5ee7c0856453c683250e4a936d35c9471a0e4d673c3722d52550e40ae195c"} Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.435415 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2842ffef-b14c-48e2-8cf5-cb9aee2d1131","Type":"ContainerStarted","Data":"6cbf59b188ef0b997ef3ddccbd47c9fc27dacfa68da371f7f7cf5163e42dbe23"} Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.900529 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.981818 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlnvd\" (UniqueName: \"kubernetes.io/projected/b0b66f79-c7ee-40c3-a026-0c42a0648f11-kube-api-access-wlnvd\") pod \"b0b66f79-c7ee-40c3-a026-0c42a0648f11\" (UID: \"b0b66f79-c7ee-40c3-a026-0c42a0648f11\") " Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.981993 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0b66f79-c7ee-40c3-a026-0c42a0648f11-operator-scripts\") pod \"b0b66f79-c7ee-40c3-a026-0c42a0648f11\" (UID: \"b0b66f79-c7ee-40c3-a026-0c42a0648f11\") " Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.983249 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b66f79-c7ee-40c3-a026-0c42a0648f11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0b66f79-c7ee-40c3-a026-0c42a0648f11" (UID: "b0b66f79-c7ee-40c3-a026-0c42a0648f11"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:54 crc kubenswrapper[4708]: I0320 16:22:54.992169 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b66f79-c7ee-40c3-a026-0c42a0648f11-kube-api-access-wlnvd" (OuterVolumeSpecName: "kube-api-access-wlnvd") pod "b0b66f79-c7ee-40c3-a026-0c42a0648f11" (UID: "b0b66f79-c7ee-40c3-a026-0c42a0648f11"). InnerVolumeSpecName "kube-api-access-wlnvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.105224 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlnvd\" (UniqueName: \"kubernetes.io/projected/b0b66f79-c7ee-40c3-a026-0c42a0648f11-kube-api-access-wlnvd\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.105253 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0b66f79-c7ee-40c3-a026-0c42a0648f11-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.275775 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwl4c" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.286056 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.358070 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0234-account-create-update-krdbf" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.416759 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm5rg\" (UniqueName: \"kubernetes.io/projected/3116dd2a-d2d0-46cf-837d-56d29a7e116f-kube-api-access-mm5rg\") pod \"3116dd2a-d2d0-46cf-837d-56d29a7e116f\" (UID: \"3116dd2a-d2d0-46cf-837d-56d29a7e116f\") " Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.416815 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqdkm\" (UniqueName: \"kubernetes.io/projected/630fc775-bda7-45ac-9852-650855479072-kube-api-access-nqdkm\") pod \"630fc775-bda7-45ac-9852-650855479072\" (UID: \"630fc775-bda7-45ac-9852-650855479072\") " Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.416998 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7hzv\" (UniqueName: \"kubernetes.io/projected/394a21a5-81ce-4b43-8642-70a03a4a0685-kube-api-access-s7hzv\") pod \"394a21a5-81ce-4b43-8642-70a03a4a0685\" (UID: \"394a21a5-81ce-4b43-8642-70a03a4a0685\") " Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.417237 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3116dd2a-d2d0-46cf-837d-56d29a7e116f-operator-scripts\") pod \"3116dd2a-d2d0-46cf-837d-56d29a7e116f\" (UID: \"3116dd2a-d2d0-46cf-837d-56d29a7e116f\") " Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.417265 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/394a21a5-81ce-4b43-8642-70a03a4a0685-operator-scripts\") pod \"394a21a5-81ce-4b43-8642-70a03a4a0685\" (UID: \"394a21a5-81ce-4b43-8642-70a03a4a0685\") " Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.417290 4708 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630fc775-bda7-45ac-9852-650855479072-operator-scripts\") pod \"630fc775-bda7-45ac-9852-650855479072\" (UID: \"630fc775-bda7-45ac-9852-650855479072\") " Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.418764 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630fc775-bda7-45ac-9852-650855479072-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "630fc775-bda7-45ac-9852-650855479072" (UID: "630fc775-bda7-45ac-9852-650855479072"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.420825 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3116dd2a-d2d0-46cf-837d-56d29a7e116f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3116dd2a-d2d0-46cf-837d-56d29a7e116f" (UID: "3116dd2a-d2d0-46cf-837d-56d29a7e116f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.422484 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394a21a5-81ce-4b43-8642-70a03a4a0685-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "394a21a5-81ce-4b43-8642-70a03a4a0685" (UID: "394a21a5-81ce-4b43-8642-70a03a4a0685"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.430352 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3116dd2a-d2d0-46cf-837d-56d29a7e116f-kube-api-access-mm5rg" (OuterVolumeSpecName: "kube-api-access-mm5rg") pod "3116dd2a-d2d0-46cf-837d-56d29a7e116f" (UID: "3116dd2a-d2d0-46cf-837d-56d29a7e116f"). 
InnerVolumeSpecName "kube-api-access-mm5rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.431429 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630fc775-bda7-45ac-9852-650855479072-kube-api-access-nqdkm" (OuterVolumeSpecName: "kube-api-access-nqdkm") pod "630fc775-bda7-45ac-9852-650855479072" (UID: "630fc775-bda7-45ac-9852-650855479072"). InnerVolumeSpecName "kube-api-access-nqdkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.435195 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394a21a5-81ce-4b43-8642-70a03a4a0685-kube-api-access-s7hzv" (OuterVolumeSpecName: "kube-api-access-s7hzv") pod "394a21a5-81ce-4b43-8642-70a03a4a0685" (UID: "394a21a5-81ce-4b43-8642-70a03a4a0685"). InnerVolumeSpecName "kube-api-access-s7hzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.461144 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2vzqh" event={"ID":"3f246f8c-2e08-400c-af52-746be688f708","Type":"ContainerDied","Data":"d0fb23daa0217fe61d433c5848b140ca0dc3937a13f6a7236a57bbca846e28f1"} Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.461208 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0fb23daa0217fe61d433c5848b140ca0dc3937a13f6a7236a57bbca846e28f1" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.467146 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwl4c" event={"ID":"630fc775-bda7-45ac-9852-650855479072","Type":"ContainerDied","Data":"1191a7dbcaab59442e6a73e474c443a08db615bbe119ce8fe4a62c0ee4e1c8b0"} Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.467229 4708 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="1191a7dbcaab59442e6a73e474c443a08db615bbe119ce8fe4a62c0ee4e1c8b0" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.467291 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwl4c" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.483355 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c01a244f-263d-435a-90d8-35bdb111bb6a","Type":"ContainerStarted","Data":"abf487aeb259304b05251e5c0503d8d1f9fd5e0507f61d0459b5fe5d0aefece0"} Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.490616 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2842ffef-b14c-48e2-8cf5-cb9aee2d1131","Type":"ContainerStarted","Data":"b4308c3b40482cce9e5709db50ddd6b822f17eb9aa2b52b7235f3f895ae1a087"} Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.496228 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" event={"ID":"3116dd2a-d2d0-46cf-837d-56d29a7e116f","Type":"ContainerDied","Data":"8c742ec44123d2e745811d0062cb7e67b1579a19217005f1762121882e9c950e"} Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.496278 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c742ec44123d2e745811d0062cb7e67b1579a19217005f1762121882e9c950e" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.496615 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f9fb-account-create-update-ngw6v" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.504273 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0234-account-create-update-krdbf" event={"ID":"394a21a5-81ce-4b43-8642-70a03a4a0685","Type":"ContainerDied","Data":"679b487bf3840839d41bd9c30760c28a8920609d2567fe352aea2f7b8faf3e1b"} Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.504454 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="679b487bf3840839d41bd9c30760c28a8920609d2567fe352aea2f7b8faf3e1b" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.504356 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0234-account-create-update-krdbf" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.507745 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2747a4ec-c212-4f8e-9266-ffb68a041ef6","Type":"ContainerStarted","Data":"8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974"} Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.521388 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.522080 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1d98-account-create-update-vvbxv" event={"ID":"b0b66f79-c7ee-40c3-a026-0c42a0648f11","Type":"ContainerDied","Data":"6e30842eb23ce669217afe90b8471485e9ff3f11e5eda2fc364c8b16bb1d9def"} Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.522109 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e30842eb23ce669217afe90b8471485e9ff3f11e5eda2fc364c8b16bb1d9def" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.522497 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7hzv\" (UniqueName: \"kubernetes.io/projected/394a21a5-81ce-4b43-8642-70a03a4a0685-kube-api-access-s7hzv\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.522515 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3116dd2a-d2d0-46cf-837d-56d29a7e116f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.522524 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/394a21a5-81ce-4b43-8642-70a03a4a0685-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.522534 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630fc775-bda7-45ac-9852-650855479072-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.522543 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm5rg\" (UniqueName: \"kubernetes.io/projected/3116dd2a-d2d0-46cf-837d-56d29a7e116f-kube-api-access-mm5rg\") on node \"crc\" 
DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.522553 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqdkm\" (UniqueName: \"kubernetes.io/projected/630fc775-bda7-45ac-9852-650855479072-kube-api-access-nqdkm\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.524336 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.524312825 podStartE2EDuration="3.524312825s" podCreationTimestamp="2026-03-20 16:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:55.511386201 +0000 UTC m=+1330.185722916" watchObservedRunningTime="2026-03-20 16:22:55.524312825 +0000 UTC m=+1330.198649560" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.532618 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qc5ff" event={"ID":"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc","Type":"ContainerDied","Data":"5cda86dd894e99a399e22155ff03b2fa23ce5df91932ea94d83976ad106bc3a1"} Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.532656 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cda86dd894e99a399e22155ff03b2fa23ce5df91932ea94d83976ad106bc3a1" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.559313 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2vzqh" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.587698 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qc5ff" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.735398 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f246f8c-2e08-400c-af52-746be688f708-operator-scripts\") pod \"3f246f8c-2e08-400c-af52-746be688f708\" (UID: \"3f246f8c-2e08-400c-af52-746be688f708\") " Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.735450 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb6jc\" (UniqueName: \"kubernetes.io/projected/3f246f8c-2e08-400c-af52-746be688f708-kube-api-access-gb6jc\") pod \"3f246f8c-2e08-400c-af52-746be688f708\" (UID: \"3f246f8c-2e08-400c-af52-746be688f708\") " Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.735534 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-operator-scripts\") pod \"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc\" (UID: \"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc\") " Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.735584 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52qh2\" (UniqueName: \"kubernetes.io/projected/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-kube-api-access-52qh2\") pod \"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc\" (UID: \"22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc\") " Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.736382 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f246f8c-2e08-400c-af52-746be688f708-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f246f8c-2e08-400c-af52-746be688f708" (UID: "3f246f8c-2e08-400c-af52-746be688f708"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.737352 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc" (UID: "22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.740902 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-kube-api-access-52qh2" (OuterVolumeSpecName: "kube-api-access-52qh2") pod "22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc" (UID: "22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc"). InnerVolumeSpecName "kube-api-access-52qh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.744370 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f246f8c-2e08-400c-af52-746be688f708-kube-api-access-gb6jc" (OuterVolumeSpecName: "kube-api-access-gb6jc") pod "3f246f8c-2e08-400c-af52-746be688f708" (UID: "3f246f8c-2e08-400c-af52-746be688f708"). InnerVolumeSpecName "kube-api-access-gb6jc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.837328 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f246f8c-2e08-400c-af52-746be688f708-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.837359 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb6jc\" (UniqueName: \"kubernetes.io/projected/3f246f8c-2e08-400c-af52-746be688f708-kube-api-access-gb6jc\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.837371 4708 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:55 crc kubenswrapper[4708]: I0320 16:22:55.837382 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52qh2\" (UniqueName: \"kubernetes.io/projected/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc-kube-api-access-52qh2\") on node \"crc\" DevicePath \"\"" Mar 20 16:22:56 crc kubenswrapper[4708]: I0320 16:22:56.543961 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2747a4ec-c212-4f8e-9266-ffb68a041ef6","Type":"ContainerStarted","Data":"d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4"} Mar 20 16:22:56 crc kubenswrapper[4708]: I0320 16:22:56.544231 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2747a4ec-c212-4f8e-9266-ffb68a041ef6","Type":"ContainerStarted","Data":"258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0"} Mar 20 16:22:56 crc kubenswrapper[4708]: I0320 16:22:56.545312 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c01a244f-263d-435a-90d8-35bdb111bb6a","Type":"ContainerStarted","Data":"8bdfd6b6b156d8348d40108ec2ffb7427e1e7702b91d59516a9d97ba9acb1e27"} Mar 20 16:22:56 crc kubenswrapper[4708]: I0320 16:22:56.545429 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c01a244f-263d-435a-90d8-35bdb111bb6a","Type":"ContainerStarted","Data":"6bd84dcfaf9f583c3be8fe5e123d222014a9b00170659d69c5aaf5ba1229e1eb"} Mar 20 16:22:56 crc kubenswrapper[4708]: I0320 16:22:56.545442 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qc5ff" Mar 20 16:22:56 crc kubenswrapper[4708]: I0320 16:22:56.545474 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2vzqh" Mar 20 16:22:56 crc kubenswrapper[4708]: I0320 16:22:56.575350 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.575329568 podStartE2EDuration="3.575329568s" podCreationTimestamp="2026-03-20 16:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:22:56.568202673 +0000 UTC m=+1331.242539388" watchObservedRunningTime="2026-03-20 16:22:56.575329568 +0000 UTC m=+1331.249666283" Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.610753 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20739bf7_b966_4dbd_8846_4bda838c5da4.slice/crio-f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a.scope WatchSource:0}: Error finding container f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a: Status 404 returned error can't find the container with id f3ffbaf80983c2bdba1b6093e630e29d6e24934b2d730bee281e72af465d726a Mar 20 16:22:57 crc 
kubenswrapper[4708]: W0320 16:22:57.611621 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod630fc775_bda7_45ac_9852_650855479072.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod630fc775_bda7_45ac_9852_650855479072.slice: no such file or directory Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.611657 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3116dd2a_d2d0_46cf_837d_56d29a7e116f.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3116dd2a_d2d0_46cf_837d_56d29a7e116f.slice: no such file or directory Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.611715 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0b66f79_c7ee_40c3_a026_0c42a0648f11.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0b66f79_c7ee_40c3_a026_0c42a0648f11.slice: no such file or directory Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.611768 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22dccba0_f2ba_4c32_8d42_a6fa8dd9fefc.slice/crio-5cda86dd894e99a399e22155ff03b2fa23ce5df91932ea94d83976ad106bc3a1": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22dccba0_f2ba_4c32_8d42_a6fa8dd9fefc.slice/crio-5cda86dd894e99a399e22155ff03b2fa23ce5df91932ea94d83976ad106bc3a1: no such file or directory Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.611784 4708 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f246f8c_2e08_400c_af52_746be688f708.slice/crio-d0fb23daa0217fe61d433c5848b140ca0dc3937a13f6a7236a57bbca846e28f1": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f246f8c_2e08_400c_af52_746be688f708.slice/crio-d0fb23daa0217fe61d433c5848b140ca0dc3937a13f6a7236a57bbca846e28f1: no such file or directory Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.611802 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod394a21a5_81ce_4b43_8642_70a03a4a0685.slice/crio-679b487bf3840839d41bd9c30760c28a8920609d2567fe352aea2f7b8faf3e1b": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod394a21a5_81ce_4b43_8642_70a03a4a0685.slice/crio-679b487bf3840839d41bd9c30760c28a8920609d2567fe352aea2f7b8faf3e1b: no such file or directory Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.611824 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22dccba0_f2ba_4c32_8d42_a6fa8dd9fefc.slice/crio-conmon-687b9c422e841e69520bc01f976381a4e90d6d634600469ca2f0b8c4d8dfdcf9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22dccba0_f2ba_4c32_8d42_a6fa8dd9fefc.slice/crio-conmon-687b9c422e841e69520bc01f976381a4e90d6d634600469ca2f0b8c4d8dfdcf9.scope: no such file or directory Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.611942 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod394a21a5_81ce_4b43_8642_70a03a4a0685.slice/crio-conmon-01fc1382778da1917738bf00c90b8ce2d1e7ef86718a5766543790d531d34d0a.scope": 0x40000100 == 
IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod394a21a5_81ce_4b43_8642_70a03a4a0685.slice/crio-conmon-01fc1382778da1917738bf00c90b8ce2d1e7ef86718a5766543790d531d34d0a.scope: no such file or directory Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.611958 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f246f8c_2e08_400c_af52_746be688f708.slice/crio-conmon-4fdeea389bf333390f5cf82b8084e149563364b6d640b9454792c9f0e2458f82.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f246f8c_2e08_400c_af52_746be688f708.slice/crio-conmon-4fdeea389bf333390f5cf82b8084e149563364b6d640b9454792c9f0e2458f82.scope: no such file or directory Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.612050 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22dccba0_f2ba_4c32_8d42_a6fa8dd9fefc.slice/crio-687b9c422e841e69520bc01f976381a4e90d6d634600469ca2f0b8c4d8dfdcf9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22dccba0_f2ba_4c32_8d42_a6fa8dd9fefc.slice/crio-687b9c422e841e69520bc01f976381a4e90d6d634600469ca2f0b8c4d8dfdcf9.scope: no such file or directory Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.612063 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod394a21a5_81ce_4b43_8642_70a03a4a0685.slice/crio-01fc1382778da1917738bf00c90b8ce2d1e7ef86718a5766543790d531d34d0a.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod394a21a5_81ce_4b43_8642_70a03a4a0685.slice/crio-01fc1382778da1917738bf00c90b8ce2d1e7ef86718a5766543790d531d34d0a.scope: no such file or directory
Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.612078 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f246f8c_2e08_400c_af52_746be688f708.slice/crio-4fdeea389bf333390f5cf82b8084e149563364b6d640b9454792c9f0e2458f82.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f246f8c_2e08_400c_af52_746be688f708.slice/crio-4fdeea389bf333390f5cf82b8084e149563364b6d640b9454792c9f0e2458f82.scope: no such file or directory
Mar 20 16:22:57 crc kubenswrapper[4708]: W0320 16:22:57.612988 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20739bf7_b966_4dbd_8846_4bda838c5da4.slice/crio-815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524.scope WatchSource:0}: Error finding container 815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524: Status 404 returned error can't find the container with id 815c263c2b2a02da35a5754b6daf85d7ea91b04d96e935f32defc6a432e2c524
Mar 20 16:22:57 crc kubenswrapper[4708]: E0320 16:22:57.920217 4708 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d2a3b0e_e329_43b5_937a_107b0bea8941.slice/crio-7c30406650e08d2764e1990ec20d4d84d89333a4fa6b608b8064ab4c0d12493b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48bfe1e4_0a34_4af1_badd_c445d2c02ce1.slice/crio-conmon-79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f246f8c_2e08_400c_af52_746be688f708.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d2a3b0e_e329_43b5_937a_107b0bea8941.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22dccba0_f2ba_4c32_8d42_a6fa8dd9fefc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod394a21a5_81ce_4b43_8642_70a03a4a0685.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.111927 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b6ff5cbbd-kjfxp"
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.200342 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-tls-certs\") pod \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") "
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.200457 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-combined-ca-bundle\") pod \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") "
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.200482 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-logs\") pod \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") "
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.200549 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-secret-key\") pod \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") "
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.200589 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlpj7\" (UniqueName: \"kubernetes.io/projected/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-kube-api-access-xlpj7\") pod \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") "
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.200766 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-scripts\") pod \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") "
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.200807 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-config-data\") pod \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\" (UID: \"48bfe1e4-0a34-4af1-badd-c445d2c02ce1\") "
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.203368 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-logs" (OuterVolumeSpecName: "logs") pod "48bfe1e4-0a34-4af1-badd-c445d2c02ce1" (UID: "48bfe1e4-0a34-4af1-badd-c445d2c02ce1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.210211 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-kube-api-access-xlpj7" (OuterVolumeSpecName: "kube-api-access-xlpj7") pod "48bfe1e4-0a34-4af1-badd-c445d2c02ce1" (UID: "48bfe1e4-0a34-4af1-badd-c445d2c02ce1"). InnerVolumeSpecName "kube-api-access-xlpj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.225820 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "48bfe1e4-0a34-4af1-badd-c445d2c02ce1" (UID: "48bfe1e4-0a34-4af1-badd-c445d2c02ce1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.250998 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48bfe1e4-0a34-4af1-badd-c445d2c02ce1" (UID: "48bfe1e4-0a34-4af1-badd-c445d2c02ce1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.257450 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-scripts" (OuterVolumeSpecName: "scripts") pod "48bfe1e4-0a34-4af1-badd-c445d2c02ce1" (UID: "48bfe1e4-0a34-4af1-badd-c445d2c02ce1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.265709 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-config-data" (OuterVolumeSpecName: "config-data") pod "48bfe1e4-0a34-4af1-badd-c445d2c02ce1" (UID: "48bfe1e4-0a34-4af1-badd-c445d2c02ce1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.307834 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.308598 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.308746 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.308856 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.308938 4708 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.309018 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlpj7\" (UniqueName: \"kubernetes.io/projected/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-kube-api-access-xlpj7\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.307937 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "48bfe1e4-0a34-4af1-badd-c445d2c02ce1" (UID: "48bfe1e4-0a34-4af1-badd-c445d2c02ce1"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.412047 4708 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/48bfe1e4-0a34-4af1-badd-c445d2c02ce1-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.565552 4708 generic.go:334] "Generic (PLEG): container finished" podID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerID="79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e" exitCode=137
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.565599 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6ff5cbbd-kjfxp" event={"ID":"48bfe1e4-0a34-4af1-badd-c445d2c02ce1","Type":"ContainerDied","Data":"79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e"}
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.565627 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6ff5cbbd-kjfxp" event={"ID":"48bfe1e4-0a34-4af1-badd-c445d2c02ce1","Type":"ContainerDied","Data":"e08be265e3cb151fb776664f40c15117f7fa813cab9614a156275362d8f4bc6c"}
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.565648 4708 scope.go:117] "RemoveContainer" containerID="ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c"
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.565873 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b6ff5cbbd-kjfxp"
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.621721 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b6ff5cbbd-kjfxp"]
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.637389 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6b6ff5cbbd-kjfxp"]
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.736066 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65454bf644-7xssx"
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.784968 4708 scope.go:117] "RemoveContainer" containerID="79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e"
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.814230 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65454bf644-7xssx"
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.816416 4708 scope.go:117] "RemoveContainer" containerID="ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c"
Mar 20 16:22:58 crc kubenswrapper[4708]: E0320 16:22:58.817035 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c\": container with ID starting with ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c not found: ID does not exist" containerID="ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c"
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.817069 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c"} err="failed to get container status \"ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c\": rpc error: code = NotFound desc = could not find container \"ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c\": container with ID starting with ce1987e845794087dd1f64a189475b8665555dca95874f10d09d39fde00e1d2c not found: ID does not exist"
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.817095 4708 scope.go:117] "RemoveContainer" containerID="79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e"
Mar 20 16:22:58 crc kubenswrapper[4708]: E0320 16:22:58.818053 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e\": container with ID starting with 79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e not found: ID does not exist" containerID="79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e"
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.818098 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e"} err="failed to get container status \"79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e\": rpc error: code = NotFound desc = could not find container \"79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e\": container with ID starting with 79750d276d8a96788b9a7c24b9d880d15ceca7b729e5336ab52121c58c5aeb3e not found: ID does not exist"
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.903330 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-798d756d44-mhhzm"]
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.903573 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-798d756d44-mhhzm" podUID="06b07110-3c5e-476a-9ff2-a460d043afc4" containerName="placement-log" containerID="cri-o://31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16" gracePeriod=30
Mar 20 16:22:58 crc kubenswrapper[4708]: I0320 16:22:58.903724 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-798d756d44-mhhzm" podUID="06b07110-3c5e-476a-9ff2-a460d043afc4" containerName="placement-api" containerID="cri-o://a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729" gracePeriod=30
Mar 20 16:22:59 crc kubenswrapper[4708]: I0320 16:22:59.575055 4708 generic.go:334] "Generic (PLEG): container finished" podID="06b07110-3c5e-476a-9ff2-a460d043afc4" containerID="31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16" exitCode=143
Mar 20 16:22:59 crc kubenswrapper[4708]: I0320 16:22:59.575122 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-798d756d44-mhhzm" event={"ID":"06b07110-3c5e-476a-9ff2-a460d043afc4","Type":"ContainerDied","Data":"31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16"}
Mar 20 16:22:59 crc kubenswrapper[4708]: I0320 16:22:59.579123 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2747a4ec-c212-4f8e-9266-ffb68a041ef6","Type":"ContainerStarted","Data":"df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634"}
Mar 20 16:22:59 crc kubenswrapper[4708]: I0320 16:22:59.579356 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="ceilometer-central-agent" containerID="cri-o://8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974" gracePeriod=30
Mar 20 16:22:59 crc kubenswrapper[4708]: I0320 16:22:59.579399 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="proxy-httpd" containerID="cri-o://df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634" gracePeriod=30
Mar 20 16:22:59 crc kubenswrapper[4708]: I0320 16:22:59.579412 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="sg-core" containerID="cri-o://d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4" gracePeriod=30
Mar 20 16:22:59 crc kubenswrapper[4708]: I0320 16:22:59.579432 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="ceilometer-notification-agent" containerID="cri-o://258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0" gracePeriod=30
Mar 20 16:22:59 crc kubenswrapper[4708]: I0320 16:22:59.608259 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.741856725 podStartE2EDuration="7.608240608s" podCreationTimestamp="2026-03-20 16:22:52 +0000 UTC" firstStartedPulling="2026-03-20 16:22:54.033549704 +0000 UTC m=+1328.707886419" lastFinishedPulling="2026-03-20 16:22:58.899933587 +0000 UTC m=+1333.574270302" observedRunningTime="2026-03-20 16:22:59.601032891 +0000 UTC m=+1334.275369606" watchObservedRunningTime="2026-03-20 16:22:59.608240608 +0000 UTC m=+1334.282577323"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.084594 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dkqq"]
Mar 20 16:23:00 crc kubenswrapper[4708]: E0320 16:23:00.085267 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085281 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon"
Mar 20 16:23:00 crc kubenswrapper[4708]: E0320 16:23:00.085291 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon-log"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085297 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon-log"
Mar 20 16:23:00 crc kubenswrapper[4708]: E0320 16:23:00.085307 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f246f8c-2e08-400c-af52-746be688f708" containerName="mariadb-database-create"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085313 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f246f8c-2e08-400c-af52-746be688f708" containerName="mariadb-database-create"
Mar 20 16:23:00 crc kubenswrapper[4708]: E0320 16:23:00.085322 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394a21a5-81ce-4b43-8642-70a03a4a0685" containerName="mariadb-account-create-update"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085328 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="394a21a5-81ce-4b43-8642-70a03a4a0685" containerName="mariadb-account-create-update"
Mar 20 16:23:00 crc kubenswrapper[4708]: E0320 16:23:00.085339 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630fc775-bda7-45ac-9852-650855479072" containerName="mariadb-database-create"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085344 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="630fc775-bda7-45ac-9852-650855479072" containerName="mariadb-database-create"
Mar 20 16:23:00 crc kubenswrapper[4708]: E0320 16:23:00.085372 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3116dd2a-d2d0-46cf-837d-56d29a7e116f" containerName="mariadb-account-create-update"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085377 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="3116dd2a-d2d0-46cf-837d-56d29a7e116f" containerName="mariadb-account-create-update"
Mar 20 16:23:00 crc kubenswrapper[4708]: E0320 16:23:00.085388 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b66f79-c7ee-40c3-a026-0c42a0648f11" containerName="mariadb-account-create-update"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085394 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b66f79-c7ee-40c3-a026-0c42a0648f11" containerName="mariadb-account-create-update"
Mar 20 16:23:00 crc kubenswrapper[4708]: E0320 16:23:00.085406 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc" containerName="mariadb-database-create"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085411 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc" containerName="mariadb-database-create"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085578 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc" containerName="mariadb-database-create"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085597 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="394a21a5-81ce-4b43-8642-70a03a4a0685" containerName="mariadb-account-create-update"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085605 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085613 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b66f79-c7ee-40c3-a026-0c42a0648f11" containerName="mariadb-account-create-update"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085624 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="630fc775-bda7-45ac-9852-650855479072" containerName="mariadb-database-create"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085636 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="3116dd2a-d2d0-46cf-837d-56d29a7e116f" containerName="mariadb-account-create-update"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085642 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" containerName="horizon-log"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.085651 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f246f8c-2e08-400c-af52-746be688f708" containerName="mariadb-database-create"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.086321 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.088720 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.089041 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.089713 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q6wtg"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.101874 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dkqq"]
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.130448 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48bfe1e4-0a34-4af1-badd-c445d2c02ce1" path="/var/lib/kubelet/pods/48bfe1e4-0a34-4af1-badd-c445d2c02ce1/volumes"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.247194 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-config-data\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.247242 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-scripts\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.247268 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.247353 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g466x\" (UniqueName: \"kubernetes.io/projected/d679da4d-509e-49d9-a465-405bda8b3e2d-kube-api-access-g466x\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.349921 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g466x\" (UniqueName: \"kubernetes.io/projected/d679da4d-509e-49d9-a465-405bda8b3e2d-kube-api-access-g466x\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.350110 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-config-data\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.350141 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-scripts\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.350173 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.357765 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-config-data\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.364779 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.371418 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-scripts\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.371475 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g466x\" (UniqueName: \"kubernetes.io/projected/d679da4d-509e-49d9-a465-405bda8b3e2d-kube-api-access-g466x\") pod \"nova-cell0-conductor-db-sync-5dkqq\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.458294 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dkqq"
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.604529 4708 generic.go:334] "Generic (PLEG): container finished" podID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerID="df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634" exitCode=0
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.604568 4708 generic.go:334] "Generic (PLEG): container finished" podID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerID="d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4" exitCode=2
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.604581 4708 generic.go:334] "Generic (PLEG): container finished" podID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerID="258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0" exitCode=0
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.604598 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2747a4ec-c212-4f8e-9266-ffb68a041ef6","Type":"ContainerDied","Data":"df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634"}
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.604647 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2747a4ec-c212-4f8e-9266-ffb68a041ef6","Type":"ContainerDied","Data":"d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4"}
Mar 20 16:23:00 crc kubenswrapper[4708]: I0320 16:23:00.604658 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2747a4ec-c212-4f8e-9266-ffb68a041ef6","Type":"ContainerDied","Data":"258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0"}
Mar 20 16:23:01 crc kubenswrapper[4708]: I0320 16:23:01.074224 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dkqq"]
Mar 20 16:23:01 crc kubenswrapper[4708]: I0320 16:23:01.322116 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 20 16:23:01 crc kubenswrapper[4708]: I0320 16:23:01.614183 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dkqq" event={"ID":"d679da4d-509e-49d9-a465-405bda8b3e2d","Type":"ContainerStarted","Data":"4151b984872d1e107b510a453e18fc7336decfcf3ce0ef37e0b473d6148a6af3"}
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.588851 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.589204 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.611890 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.627435 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-798d756d44-mhhzm"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.639213 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.652722 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.654348 4708 generic.go:334] "Generic (PLEG): container finished" podID="06b07110-3c5e-476a-9ff2-a460d043afc4" containerID="a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729" exitCode=0
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.654432 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-798d756d44-mhhzm" event={"ID":"06b07110-3c5e-476a-9ff2-a460d043afc4","Type":"ContainerDied","Data":"a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729"}
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.654466 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-798d756d44-mhhzm" event={"ID":"06b07110-3c5e-476a-9ff2-a460d043afc4","Type":"ContainerDied","Data":"05e153e621495e13c1164df458b65622618d75a9d3e0f4c763e1637e419babc5"}
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.654490 4708 scope.go:117] "RemoveContainer" containerID="a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.654650 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-798d756d44-mhhzm"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.680120 4708 generic.go:334] "Generic (PLEG): container finished" podID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerID="8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974" exitCode=0
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.680195 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.680246 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2747a4ec-c212-4f8e-9266-ffb68a041ef6","Type":"ContainerDied","Data":"8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974"}
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.680278 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2747a4ec-c212-4f8e-9266-ffb68a041ef6","Type":"ContainerDied","Data":"54df14489dd4af3340f511ef38a58e4d186cda0ab130925bdc277f38781652b5"}
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.681947 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.682059 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.702099 4708 scope.go:117] "RemoveContainer" containerID="31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.729627 4708 scope.go:117] "RemoveContainer" containerID="a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729"
Mar 20 16:23:02 crc kubenswrapper[4708]: E0320 16:23:02.731097 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729\": container with ID starting with a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729 not found: ID does not exist" containerID="a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.731138 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729"} err="failed to get container status \"a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729\": rpc error: code = NotFound desc = could not find container \"a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729\": container with ID starting with a17018a7a62a705e25c32747c9f8cea0ac49854df7f6d97f4d303211caa51729 not found: ID does not exist"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.731158 4708 scope.go:117] "RemoveContainer" containerID="31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16"
Mar 20 16:23:02 crc kubenswrapper[4708]: E0320 16:23:02.731373 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16\": container with ID starting with 31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16 not found: ID does not exist" containerID="31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.731413 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16"} err="failed to get container status \"31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16\": rpc error: code = NotFound desc = could not find container \"31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16\": container with ID starting with 31c26ce4f215ef42ab29bb4a4bda90cd6764b04bd01e0475b14f9734310fdb16 not found: ID does not exist"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.731427 4708 scope.go:117] "RemoveContainer" containerID="df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.770932 4708 scope.go:117] "RemoveContainer" containerID="d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.803130 4708 scope.go:117] "RemoveContainer" containerID="258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0"
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807139 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-scripts\") pod \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") "
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807185 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-scripts\") pod \"06b07110-3c5e-476a-9ff2-a460d043afc4\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") "
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807252 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-sg-core-conf-yaml\") pod \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") "
Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807297 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4m8z\" (UniqueName: \"kubernetes.io/projected/06b07110-3c5e-476a-9ff2-a460d043afc4-kube-api-access-s4m8z\") pod \"06b07110-3c5e-476a-9ff2-a460d043afc4\"
(UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807320 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-internal-tls-certs\") pod \"06b07110-3c5e-476a-9ff2-a460d043afc4\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807375 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-config-data\") pod \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807461 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-public-tls-certs\") pod \"06b07110-3c5e-476a-9ff2-a460d043afc4\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807526 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-log-httpd\") pod \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807618 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-combined-ca-bundle\") pod \"06b07110-3c5e-476a-9ff2-a460d043afc4\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807703 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xjg6\" (UniqueName: 
\"kubernetes.io/projected/2747a4ec-c212-4f8e-9266-ffb68a041ef6-kube-api-access-5xjg6\") pod \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807723 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-combined-ca-bundle\") pod \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807790 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-run-httpd\") pod \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\" (UID: \"2747a4ec-c212-4f8e-9266-ffb68a041ef6\") " Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807824 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06b07110-3c5e-476a-9ff2-a460d043afc4-logs\") pod \"06b07110-3c5e-476a-9ff2-a460d043afc4\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.807960 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-config-data\") pod \"06b07110-3c5e-476a-9ff2-a460d043afc4\" (UID: \"06b07110-3c5e-476a-9ff2-a460d043afc4\") " Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.808460 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2747a4ec-c212-4f8e-9266-ffb68a041ef6" (UID: "2747a4ec-c212-4f8e-9266-ffb68a041ef6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.813019 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2747a4ec-c212-4f8e-9266-ffb68a041ef6" (UID: "2747a4ec-c212-4f8e-9266-ffb68a041ef6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.818065 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-scripts" (OuterVolumeSpecName: "scripts") pod "06b07110-3c5e-476a-9ff2-a460d043afc4" (UID: "06b07110-3c5e-476a-9ff2-a460d043afc4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.818348 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06b07110-3c5e-476a-9ff2-a460d043afc4-logs" (OuterVolumeSpecName: "logs") pod "06b07110-3c5e-476a-9ff2-a460d043afc4" (UID: "06b07110-3c5e-476a-9ff2-a460d043afc4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.839908 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2747a4ec-c212-4f8e-9266-ffb68a041ef6-kube-api-access-5xjg6" (OuterVolumeSpecName: "kube-api-access-5xjg6") pod "2747a4ec-c212-4f8e-9266-ffb68a041ef6" (UID: "2747a4ec-c212-4f8e-9266-ffb68a041ef6"). InnerVolumeSpecName "kube-api-access-5xjg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.843233 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b07110-3c5e-476a-9ff2-a460d043afc4-kube-api-access-s4m8z" (OuterVolumeSpecName: "kube-api-access-s4m8z") pod "06b07110-3c5e-476a-9ff2-a460d043afc4" (UID: "06b07110-3c5e-476a-9ff2-a460d043afc4"). InnerVolumeSpecName "kube-api-access-s4m8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.850815 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-scripts" (OuterVolumeSpecName: "scripts") pod "2747a4ec-c212-4f8e-9266-ffb68a041ef6" (UID: "2747a4ec-c212-4f8e-9266-ffb68a041ef6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.881848 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2747a4ec-c212-4f8e-9266-ffb68a041ef6" (UID: "2747a4ec-c212-4f8e-9266-ffb68a041ef6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.912033 4708 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.912072 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xjg6\" (UniqueName: \"kubernetes.io/projected/2747a4ec-c212-4f8e-9266-ffb68a041ef6-kube-api-access-5xjg6\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.912086 4708 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2747a4ec-c212-4f8e-9266-ffb68a041ef6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.912120 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06b07110-3c5e-476a-9ff2-a460d043afc4-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.912132 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.912141 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.912149 4708 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.912158 4708 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-s4m8z\" (UniqueName: \"kubernetes.io/projected/06b07110-3c5e-476a-9ff2-a460d043afc4-kube-api-access-s4m8z\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.948522 4708 scope.go:117] "RemoveContainer" containerID="8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.960595 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06b07110-3c5e-476a-9ff2-a460d043afc4" (UID: "06b07110-3c5e-476a-9ff2-a460d043afc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.965584 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-config-data" (OuterVolumeSpecName: "config-data") pod "06b07110-3c5e-476a-9ff2-a460d043afc4" (UID: "06b07110-3c5e-476a-9ff2-a460d043afc4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.986765 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "06b07110-3c5e-476a-9ff2-a460d043afc4" (UID: "06b07110-3c5e-476a-9ff2-a460d043afc4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:02 crc kubenswrapper[4708]: I0320 16:23:02.997848 4708 scope.go:117] "RemoveContainer" containerID="df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634" Mar 20 16:23:03 crc kubenswrapper[4708]: E0320 16:23:02.999445 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634\": container with ID starting with df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634 not found: ID does not exist" containerID="df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:02.999496 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634"} err="failed to get container status \"df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634\": rpc error: code = NotFound desc = could not find container \"df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634\": container with ID starting with df45407d8d922e02ae168b48a5476fa73d0bb117e53dd9723fb6391ca6941634 not found: ID does not exist" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:02.999525 4708 scope.go:117] "RemoveContainer" containerID="d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4" Mar 20 16:23:03 crc kubenswrapper[4708]: E0320 16:23:02.999836 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4\": container with ID starting with d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4 not found: ID does not exist" containerID="d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:02.999896 
4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4"} err="failed to get container status \"d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4\": rpc error: code = NotFound desc = could not find container \"d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4\": container with ID starting with d59997a2588293b7b469e5f76c68125ef30d9da2259e720a7d330d86f8f12ad4 not found: ID does not exist" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:02.999920 4708 scope.go:117] "RemoveContainer" containerID="258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0" Mar 20 16:23:03 crc kubenswrapper[4708]: E0320 16:23:03.000145 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0\": container with ID starting with 258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0 not found: ID does not exist" containerID="258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.000173 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0"} err="failed to get container status \"258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0\": rpc error: code = NotFound desc = could not find container \"258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0\": container with ID starting with 258942a29162c1111a17a6581b63b696ea709489b3fda61994496e8c50314cc0 not found: ID does not exist" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.000187 4708 scope.go:117] "RemoveContainer" containerID="8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974" Mar 20 16:23:03 crc kubenswrapper[4708]: E0320 
16:23:03.000626 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974\": container with ID starting with 8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974 not found: ID does not exist" containerID="8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.000655 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974"} err="failed to get container status \"8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974\": rpc error: code = NotFound desc = could not find container \"8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974\": container with ID starting with 8e9000bc884deda9087ba145196b5ebf13e0784d77121b1a66edd80874e44974 not found: ID does not exist" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.001303 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2747a4ec-c212-4f8e-9266-ffb68a041ef6" (UID: "2747a4ec-c212-4f8e-9266-ffb68a041ef6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.013972 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.014002 4708 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.014013 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.014023 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.056639 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "06b07110-3c5e-476a-9ff2-a460d043afc4" (UID: "06b07110-3c5e-476a-9ff2-a460d043afc4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.070438 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-config-data" (OuterVolumeSpecName: "config-data") pod "2747a4ec-c212-4f8e-9266-ffb68a041ef6" (UID: "2747a4ec-c212-4f8e-9266-ffb68a041ef6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.115251 4708 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06b07110-3c5e-476a-9ff2-a460d043afc4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.115295 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2747a4ec-c212-4f8e-9266-ffb68a041ef6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.317784 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-798d756d44-mhhzm"] Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.328474 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-798d756d44-mhhzm"] Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.365902 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.388042 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.433831 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:03 crc kubenswrapper[4708]: E0320 16:23:03.435126 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="ceilometer-central-agent" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.435154 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="ceilometer-central-agent" Mar 20 16:23:03 crc kubenswrapper[4708]: E0320 16:23:03.435181 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="proxy-httpd" Mar 20 16:23:03 crc 
kubenswrapper[4708]: I0320 16:23:03.435192 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="proxy-httpd" Mar 20 16:23:03 crc kubenswrapper[4708]: E0320 16:23:03.435224 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b07110-3c5e-476a-9ff2-a460d043afc4" containerName="placement-api" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.435235 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b07110-3c5e-476a-9ff2-a460d043afc4" containerName="placement-api" Mar 20 16:23:03 crc kubenswrapper[4708]: E0320 16:23:03.435261 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b07110-3c5e-476a-9ff2-a460d043afc4" containerName="placement-log" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.435270 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b07110-3c5e-476a-9ff2-a460d043afc4" containerName="placement-log" Mar 20 16:23:03 crc kubenswrapper[4708]: E0320 16:23:03.435286 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="ceilometer-notification-agent" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.435296 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="ceilometer-notification-agent" Mar 20 16:23:03 crc kubenswrapper[4708]: E0320 16:23:03.435331 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="sg-core" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.435346 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="sg-core" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.436117 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="ceilometer-notification-agent" Mar 20 16:23:03 crc 
kubenswrapper[4708]: I0320 16:23:03.436161 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b07110-3c5e-476a-9ff2-a460d043afc4" containerName="placement-log" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.436183 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="proxy-httpd" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.436203 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="sg-core" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.436227 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b07110-3c5e-476a-9ff2-a460d043afc4" containerName="placement-api" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.436246 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" containerName="ceilometer-central-agent" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.441615 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.445382 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.445546 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.473246 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.627400 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-log-httpd\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.627473 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.627744 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-scripts\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.627915 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ltzw\" (UniqueName: \"kubernetes.io/projected/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-kube-api-access-2ltzw\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " 
pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.628321 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-run-httpd\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.628526 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-config-data\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.628563 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.730651 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-config-data\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.730711 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.730769 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-log-httpd\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.730801 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.730830 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-scripts\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.730855 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ltzw\" (UniqueName: \"kubernetes.io/projected/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-kube-api-access-2ltzw\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.730930 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-run-httpd\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.731259 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-log-httpd\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc 
kubenswrapper[4708]: I0320 16:23:03.732015 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-run-httpd\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.736203 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-scripts\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.737185 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-config-data\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.738250 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.740040 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.753824 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ltzw\" (UniqueName: \"kubernetes.io/projected/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-kube-api-access-2ltzw\") pod \"ceilometer-0\" (UID: 
\"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.774281 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.837784 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.837955 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.887905 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 16:23:03 crc kubenswrapper[4708]: I0320 16:23:03.908986 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 16:23:04 crc kubenswrapper[4708]: I0320 16:23:04.124116 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b07110-3c5e-476a-9ff2-a460d043afc4" path="/var/lib/kubelet/pods/06b07110-3c5e-476a-9ff2-a460d043afc4/volumes" Mar 20 16:23:04 crc kubenswrapper[4708]: I0320 16:23:04.125793 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2747a4ec-c212-4f8e-9266-ffb68a041ef6" path="/var/lib/kubelet/pods/2747a4ec-c212-4f8e-9266-ffb68a041ef6/volumes" Mar 20 16:23:04 crc kubenswrapper[4708]: I0320 16:23:04.319495 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:04 crc kubenswrapper[4708]: W0320 16:23:04.336883 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ef7cec_7d33_46e1_9bdd_26e07ef8cec1.slice/crio-7e22ffae1b9012a3e297bf1479b1dd45261f2e7924f4e163d3b370e9d6919f8f WatchSource:0}: Error finding container 
7e22ffae1b9012a3e297bf1479b1dd45261f2e7924f4e163d3b370e9d6919f8f: Status 404 returned error can't find the container with id 7e22ffae1b9012a3e297bf1479b1dd45261f2e7924f4e163d3b370e9d6919f8f Mar 20 16:23:04 crc kubenswrapper[4708]: I0320 16:23:04.706235 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1","Type":"ContainerStarted","Data":"7e22ffae1b9012a3e297bf1479b1dd45261f2e7924f4e163d3b370e9d6919f8f"} Mar 20 16:23:04 crc kubenswrapper[4708]: I0320 16:23:04.706611 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 16:23:04 crc kubenswrapper[4708]: I0320 16:23:04.706637 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 16:23:04 crc kubenswrapper[4708]: I0320 16:23:04.994291 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 16:23:04 crc kubenswrapper[4708]: I0320 16:23:04.994760 4708 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:23:05 crc kubenswrapper[4708]: I0320 16:23:05.069808 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:05 crc kubenswrapper[4708]: I0320 16:23:05.311532 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 16:23:05 crc kubenswrapper[4708]: I0320 16:23:05.744796 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1","Type":"ContainerStarted","Data":"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99"} Mar 20 16:23:07 crc kubenswrapper[4708]: I0320 16:23:07.027452 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 16:23:07 
crc kubenswrapper[4708]: I0320 16:23:07.027914 4708 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 16:23:07 crc kubenswrapper[4708]: I0320 16:23:07.033133 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 16:23:12 crc kubenswrapper[4708]: I0320 16:23:12.810817 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dkqq" event={"ID":"d679da4d-509e-49d9-a465-405bda8b3e2d","Type":"ContainerStarted","Data":"83a49cf7103d63e64e2e66b7938372ce9a84e5e438e97eb4aaf4e992b32eab44"} Mar 20 16:23:12 crc kubenswrapper[4708]: I0320 16:23:12.812786 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1","Type":"ContainerStarted","Data":"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed"} Mar 20 16:23:12 crc kubenswrapper[4708]: I0320 16:23:12.831464 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5dkqq" podStartSLOduration=2.046975512 podStartE2EDuration="12.831444538s" podCreationTimestamp="2026-03-20 16:23:00 +0000 UTC" firstStartedPulling="2026-03-20 16:23:01.087782143 +0000 UTC m=+1335.762118858" lastFinishedPulling="2026-03-20 16:23:11.872251169 +0000 UTC m=+1346.546587884" observedRunningTime="2026-03-20 16:23:12.825224748 +0000 UTC m=+1347.499561473" watchObservedRunningTime="2026-03-20 16:23:12.831444538 +0000 UTC m=+1347.505781253" Mar 20 16:23:13 crc kubenswrapper[4708]: I0320 16:23:13.834236 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1","Type":"ContainerStarted","Data":"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33"} Mar 20 16:23:15 crc kubenswrapper[4708]: I0320 16:23:15.861697 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1","Type":"ContainerStarted","Data":"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf"} Mar 20 16:23:15 crc kubenswrapper[4708]: I0320 16:23:15.862459 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="ceilometer-central-agent" containerID="cri-o://0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99" gracePeriod=30 Mar 20 16:23:15 crc kubenswrapper[4708]: I0320 16:23:15.862778 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:23:15 crc kubenswrapper[4708]: I0320 16:23:15.863329 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="proxy-httpd" containerID="cri-o://b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf" gracePeriod=30 Mar 20 16:23:15 crc kubenswrapper[4708]: I0320 16:23:15.863384 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="sg-core" containerID="cri-o://6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33" gracePeriod=30 Mar 20 16:23:15 crc kubenswrapper[4708]: I0320 16:23:15.863423 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="ceilometer-notification-agent" containerID="cri-o://8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed" gracePeriod=30 Mar 20 16:23:15 crc kubenswrapper[4708]: I0320 16:23:15.897106 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.390282979 podStartE2EDuration="12.897087764s" 
podCreationTimestamp="2026-03-20 16:23:03 +0000 UTC" firstStartedPulling="2026-03-20 16:23:04.345946459 +0000 UTC m=+1339.020283174" lastFinishedPulling="2026-03-20 16:23:14.852751254 +0000 UTC m=+1349.527087959" observedRunningTime="2026-03-20 16:23:15.886933497 +0000 UTC m=+1350.561270212" watchObservedRunningTime="2026-03-20 16:23:15.897087764 +0000 UTC m=+1350.571424479" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.390139 4708 scope.go:117] "RemoveContainer" containerID="fb7ffcefa4b194620669bc9230869fe6380bd5b3ca53a6177b7844fdb5d9b92b" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.427810 4708 scope.go:117] "RemoveContainer" containerID="41c0141f0659169a510907b5d0cdf35946ea9dc813059f945277235896f3f7b1" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.487382 4708 scope.go:117] "RemoveContainer" containerID="0e2e7884c806a50bf0c5af599e085e057846747495a1ce12b240261d127a15dd" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.557961 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.633971 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ltzw\" (UniqueName: \"kubernetes.io/projected/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-kube-api-access-2ltzw\") pod \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.634037 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-config-data\") pod \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.634122 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-combined-ca-bundle\") pod \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.634154 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-run-httpd\") pod \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.634225 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-scripts\") pod \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.634265 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-log-httpd\") pod \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.634296 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-sg-core-conf-yaml\") pod \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\" (UID: \"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1\") " Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.635523 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" (UID: "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.635726 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" (UID: "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.638048 4708 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.638124 4708 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.640437 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-kube-api-access-2ltzw" (OuterVolumeSpecName: "kube-api-access-2ltzw") pod "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" (UID: "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1"). InnerVolumeSpecName "kube-api-access-2ltzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.640887 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-scripts" (OuterVolumeSpecName: "scripts") pod "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" (UID: "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.662711 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" (UID: "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.709631 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" (UID: "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.729876 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-config-data" (OuterVolumeSpecName: "config-data") pod "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" (UID: "41ef7cec-7d33-46e1-9bdd-26e07ef8cec1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.740287 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.740317 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.740327 4708 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.740338 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ltzw\" (UniqueName: \"kubernetes.io/projected/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-kube-api-access-2ltzw\") on node \"crc\" DevicePath 
\"\"" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.740348 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.873620 4708 generic.go:334] "Generic (PLEG): container finished" podID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerID="b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf" exitCode=0 Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.873648 4708 generic.go:334] "Generic (PLEG): container finished" podID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerID="6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33" exitCode=2 Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.873657 4708 generic.go:334] "Generic (PLEG): container finished" podID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerID="8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed" exitCode=0 Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.873679 4708 generic.go:334] "Generic (PLEG): container finished" podID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerID="0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99" exitCode=0 Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.873711 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.873708 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1","Type":"ContainerDied","Data":"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf"} Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.873766 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1","Type":"ContainerDied","Data":"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33"} Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.873782 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1","Type":"ContainerDied","Data":"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed"} Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.873797 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1","Type":"ContainerDied","Data":"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99"} Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.873808 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41ef7cec-7d33-46e1-9bdd-26e07ef8cec1","Type":"ContainerDied","Data":"7e22ffae1b9012a3e297bf1479b1dd45261f2e7924f4e163d3b370e9d6919f8f"} Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.873850 4708 scope.go:117] "RemoveContainer" containerID="b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.902523 4708 scope.go:117] "RemoveContainer" containerID="6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.937243 4708 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.951858 4708 scope.go:117] "RemoveContainer" containerID="8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.952036 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.961903 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:16 crc kubenswrapper[4708]: E0320 16:23:16.962462 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="proxy-httpd" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.962485 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="proxy-httpd" Mar 20 16:23:16 crc kubenswrapper[4708]: E0320 16:23:16.962512 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="sg-core" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.962519 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="sg-core" Mar 20 16:23:16 crc kubenswrapper[4708]: E0320 16:23:16.962538 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="ceilometer-central-agent" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.962546 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="ceilometer-central-agent" Mar 20 16:23:16 crc kubenswrapper[4708]: E0320 16:23:16.962554 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="ceilometer-notification-agent" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.962579 4708 
state_mem.go:107] "Deleted CPUSet assignment" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="ceilometer-notification-agent" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.962821 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="ceilometer-notification-agent" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.962856 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="ceilometer-central-agent" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.962867 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="sg-core" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.962880 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" containerName="proxy-httpd" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.964921 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.971102 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.972886 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.974234 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:23:16 crc kubenswrapper[4708]: I0320 16:23:16.989059 4708 scope.go:117] "RemoveContainer" containerID="0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.047726 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.048097 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.048193 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-run-httpd\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.048260 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-log-httpd\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.048277 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-config-data\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.048374 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntbcn\" (UniqueName: \"kubernetes.io/projected/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-kube-api-access-ntbcn\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.048418 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-scripts\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.057763 4708 scope.go:117] "RemoveContainer" containerID="b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf" Mar 20 16:23:17 crc kubenswrapper[4708]: E0320 16:23:17.058295 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf\": container with ID starting with b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf not found: ID does not exist" containerID="b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf" Mar 20 16:23:17 crc 
kubenswrapper[4708]: I0320 16:23:17.058329 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf"} err="failed to get container status \"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf\": rpc error: code = NotFound desc = could not find container \"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf\": container with ID starting with b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.058357 4708 scope.go:117] "RemoveContainer" containerID="6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33" Mar 20 16:23:17 crc kubenswrapper[4708]: E0320 16:23:17.058687 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33\": container with ID starting with 6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33 not found: ID does not exist" containerID="6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.058709 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33"} err="failed to get container status \"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33\": rpc error: code = NotFound desc = could not find container \"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33\": container with ID starting with 6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33 not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.058722 4708 scope.go:117] "RemoveContainer" containerID="8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed" Mar 20 
16:23:17 crc kubenswrapper[4708]: E0320 16:23:17.059038 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed\": container with ID starting with 8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed not found: ID does not exist" containerID="8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.059066 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed"} err="failed to get container status \"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed\": rpc error: code = NotFound desc = could not find container \"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed\": container with ID starting with 8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.059078 4708 scope.go:117] "RemoveContainer" containerID="0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99" Mar 20 16:23:17 crc kubenswrapper[4708]: E0320 16:23:17.059306 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99\": container with ID starting with 0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99 not found: ID does not exist" containerID="0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.059327 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99"} err="failed to get container status 
\"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99\": rpc error: code = NotFound desc = could not find container \"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99\": container with ID starting with 0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99 not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.059341 4708 scope.go:117] "RemoveContainer" containerID="b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.059554 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf"} err="failed to get container status \"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf\": rpc error: code = NotFound desc = could not find container \"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf\": container with ID starting with b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.059582 4708 scope.go:117] "RemoveContainer" containerID="6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.059783 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33"} err="failed to get container status \"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33\": rpc error: code = NotFound desc = could not find container \"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33\": container with ID starting with 6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33 not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.059812 4708 scope.go:117] "RemoveContainer" 
containerID="8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.059997 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed"} err="failed to get container status \"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed\": rpc error: code = NotFound desc = could not find container \"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed\": container with ID starting with 8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.060029 4708 scope.go:117] "RemoveContainer" containerID="0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.060276 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99"} err="failed to get container status \"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99\": rpc error: code = NotFound desc = could not find container \"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99\": container with ID starting with 0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99 not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.060294 4708 scope.go:117] "RemoveContainer" containerID="b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.060529 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf"} err="failed to get container status \"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf\": rpc error: code = NotFound desc = could 
not find container \"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf\": container with ID starting with b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.060554 4708 scope.go:117] "RemoveContainer" containerID="6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.060784 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33"} err="failed to get container status \"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33\": rpc error: code = NotFound desc = could not find container \"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33\": container with ID starting with 6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33 not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.060808 4708 scope.go:117] "RemoveContainer" containerID="8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.060962 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed"} err="failed to get container status \"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed\": rpc error: code = NotFound desc = could not find container \"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed\": container with ID starting with 8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.060981 4708 scope.go:117] "RemoveContainer" containerID="0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 
16:23:17.061174 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99"} err="failed to get container status \"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99\": rpc error: code = NotFound desc = could not find container \"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99\": container with ID starting with 0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99 not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.061195 4708 scope.go:117] "RemoveContainer" containerID="b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.061421 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf"} err="failed to get container status \"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf\": rpc error: code = NotFound desc = could not find container \"b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf\": container with ID starting with b5b6bb00408f9b71d503e918db242ba4dce34607778f5df3315b01d5630e1adf not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.061443 4708 scope.go:117] "RemoveContainer" containerID="6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.061695 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33"} err="failed to get container status \"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33\": rpc error: code = NotFound desc = could not find container \"6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33\": container with ID starting with 
6f42df8517ee3166500a73699d75b200fdb6333d739fc7eb3866571c669e6a33 not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.061721 4708 scope.go:117] "RemoveContainer" containerID="8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.061942 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed"} err="failed to get container status \"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed\": rpc error: code = NotFound desc = could not find container \"8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed\": container with ID starting with 8a5c5ac2458db967972224d9de3b7b85442b471a1f7f6e5246d92694d7fa66ed not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.062312 4708 scope.go:117] "RemoveContainer" containerID="0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.062579 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99"} err="failed to get container status \"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99\": rpc error: code = NotFound desc = could not find container \"0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99\": container with ID starting with 0968b746f9723fac25f8304acb8dbac6457ddc1d92b0c848c3ba3b8a2fd6bb99 not found: ID does not exist" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.150260 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-scripts\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc 
kubenswrapper[4708]: I0320 16:23:17.150356 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.150401 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.150477 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-run-httpd\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.150577 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-log-httpd\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.150600 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-config-data\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.150723 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntbcn\" (UniqueName: 
\"kubernetes.io/projected/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-kube-api-access-ntbcn\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.151883 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-log-httpd\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.151890 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-run-httpd\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.155628 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-config-data\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.155764 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.156130 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-scripts\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.156821 4708 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.169216 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntbcn\" (UniqueName: \"kubernetes.io/projected/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-kube-api-access-ntbcn\") pod \"ceilometer-0\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.349152 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.789416 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:17 crc kubenswrapper[4708]: W0320 16:23:17.792904 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde69fe5a_0a95_4387_9c43_4ff1fe53bad4.slice/crio-64be44ce3fef71f67903ba7ea130ea6eb2b7bc35180c8ea67812f2b9033a5f45 WatchSource:0}: Error finding container 64be44ce3fef71f67903ba7ea130ea6eb2b7bc35180c8ea67812f2b9033a5f45: Status 404 returned error can't find the container with id 64be44ce3fef71f67903ba7ea130ea6eb2b7bc35180c8ea67812f2b9033a5f45 Mar 20 16:23:17 crc kubenswrapper[4708]: I0320 16:23:17.884199 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de69fe5a-0a95-4387-9c43-4ff1fe53bad4","Type":"ContainerStarted","Data":"64be44ce3fef71f67903ba7ea130ea6eb2b7bc35180c8ea67812f2b9033a5f45"} Mar 20 16:23:18 crc kubenswrapper[4708]: I0320 16:23:18.122458 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ef7cec-7d33-46e1-9bdd-26e07ef8cec1" 
path="/var/lib/kubelet/pods/41ef7cec-7d33-46e1-9bdd-26e07ef8cec1/volumes" Mar 20 16:23:19 crc kubenswrapper[4708]: I0320 16:23:19.905348 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de69fe5a-0a95-4387-9c43-4ff1fe53bad4","Type":"ContainerStarted","Data":"b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15"} Mar 20 16:23:19 crc kubenswrapper[4708]: I0320 16:23:19.905961 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de69fe5a-0a95-4387-9c43-4ff1fe53bad4","Type":"ContainerStarted","Data":"5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f"} Mar 20 16:23:20 crc kubenswrapper[4708]: I0320 16:23:20.916734 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de69fe5a-0a95-4387-9c43-4ff1fe53bad4","Type":"ContainerStarted","Data":"d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd"} Mar 20 16:23:22 crc kubenswrapper[4708]: I0320 16:23:22.936891 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de69fe5a-0a95-4387-9c43-4ff1fe53bad4","Type":"ContainerStarted","Data":"63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0"} Mar 20 16:23:22 crc kubenswrapper[4708]: I0320 16:23:22.937248 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:23:22 crc kubenswrapper[4708]: I0320 16:23:22.959570 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.311625073 podStartE2EDuration="6.959553625s" podCreationTimestamp="2026-03-20 16:23:16 +0000 UTC" firstStartedPulling="2026-03-20 16:23:17.795293231 +0000 UTC m=+1352.469629946" lastFinishedPulling="2026-03-20 16:23:22.443221783 +0000 UTC m=+1357.117558498" observedRunningTime="2026-03-20 16:23:22.959455332 +0000 UTC m=+1357.633792057" watchObservedRunningTime="2026-03-20 
16:23:22.959553625 +0000 UTC m=+1357.633890340" Mar 20 16:23:24 crc kubenswrapper[4708]: I0320 16:23:24.960995 4708 generic.go:334] "Generic (PLEG): container finished" podID="d679da4d-509e-49d9-a465-405bda8b3e2d" containerID="83a49cf7103d63e64e2e66b7938372ce9a84e5e438e97eb4aaf4e992b32eab44" exitCode=0 Mar 20 16:23:24 crc kubenswrapper[4708]: I0320 16:23:24.961081 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dkqq" event={"ID":"d679da4d-509e-49d9-a465-405bda8b3e2d","Type":"ContainerDied","Data":"83a49cf7103d63e64e2e66b7938372ce9a84e5e438e97eb4aaf4e992b32eab44"} Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.178832 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.179440 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.357906 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dkqq" Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.452850 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-config-data\") pod \"d679da4d-509e-49d9-a465-405bda8b3e2d\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.453398 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g466x\" (UniqueName: \"kubernetes.io/projected/d679da4d-509e-49d9-a465-405bda8b3e2d-kube-api-access-g466x\") pod \"d679da4d-509e-49d9-a465-405bda8b3e2d\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.453438 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-combined-ca-bundle\") pod \"d679da4d-509e-49d9-a465-405bda8b3e2d\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.453463 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-scripts\") pod \"d679da4d-509e-49d9-a465-405bda8b3e2d\" (UID: \"d679da4d-509e-49d9-a465-405bda8b3e2d\") " Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.463568 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-scripts" (OuterVolumeSpecName: "scripts") pod "d679da4d-509e-49d9-a465-405bda8b3e2d" (UID: "d679da4d-509e-49d9-a465-405bda8b3e2d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.477100 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d679da4d-509e-49d9-a465-405bda8b3e2d-kube-api-access-g466x" (OuterVolumeSpecName: "kube-api-access-g466x") pod "d679da4d-509e-49d9-a465-405bda8b3e2d" (UID: "d679da4d-509e-49d9-a465-405bda8b3e2d"). InnerVolumeSpecName "kube-api-access-g466x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.489523 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-config-data" (OuterVolumeSpecName: "config-data") pod "d679da4d-509e-49d9-a465-405bda8b3e2d" (UID: "d679da4d-509e-49d9-a465-405bda8b3e2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.495554 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d679da4d-509e-49d9-a465-405bda8b3e2d" (UID: "d679da4d-509e-49d9-a465-405bda8b3e2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.555062 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.555107 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g466x\" (UniqueName: \"kubernetes.io/projected/d679da4d-509e-49d9-a465-405bda8b3e2d-kube-api-access-g466x\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.555123 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.555135 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d679da4d-509e-49d9-a465-405bda8b3e2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.986663 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dkqq" event={"ID":"d679da4d-509e-49d9-a465-405bda8b3e2d","Type":"ContainerDied","Data":"4151b984872d1e107b510a453e18fc7336decfcf3ce0ef37e0b473d6148a6af3"} Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.986724 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4151b984872d1e107b510a453e18fc7336decfcf3ce0ef37e0b473d6148a6af3" Mar 20 16:23:26 crc kubenswrapper[4708]: I0320 16:23:26.986802 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dkqq" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.109246 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 16:23:27 crc kubenswrapper[4708]: E0320 16:23:27.109703 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d679da4d-509e-49d9-a465-405bda8b3e2d" containerName="nova-cell0-conductor-db-sync" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.109721 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="d679da4d-509e-49d9-a465-405bda8b3e2d" containerName="nova-cell0-conductor-db-sync" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.109911 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="d679da4d-509e-49d9-a465-405bda8b3e2d" containerName="nova-cell0-conductor-db-sync" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.110892 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.113366 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q6wtg" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.113732 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.122149 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.170729 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clndf\" (UniqueName: \"kubernetes.io/projected/281ab006-f27d-4cb5-9d26-8fde6cc40ab2-kube-api-access-clndf\") pod \"nova-cell0-conductor-0\" (UID: \"281ab006-f27d-4cb5-9d26-8fde6cc40ab2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:27 crc 
kubenswrapper[4708]: I0320 16:23:27.170795 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281ab006-f27d-4cb5-9d26-8fde6cc40ab2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"281ab006-f27d-4cb5-9d26-8fde6cc40ab2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.170903 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281ab006-f27d-4cb5-9d26-8fde6cc40ab2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"281ab006-f27d-4cb5-9d26-8fde6cc40ab2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.273570 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clndf\" (UniqueName: \"kubernetes.io/projected/281ab006-f27d-4cb5-9d26-8fde6cc40ab2-kube-api-access-clndf\") pod \"nova-cell0-conductor-0\" (UID: \"281ab006-f27d-4cb5-9d26-8fde6cc40ab2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.273693 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281ab006-f27d-4cb5-9d26-8fde6cc40ab2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"281ab006-f27d-4cb5-9d26-8fde6cc40ab2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.273813 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281ab006-f27d-4cb5-9d26-8fde6cc40ab2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"281ab006-f27d-4cb5-9d26-8fde6cc40ab2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.280287 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281ab006-f27d-4cb5-9d26-8fde6cc40ab2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"281ab006-f27d-4cb5-9d26-8fde6cc40ab2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.280596 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281ab006-f27d-4cb5-9d26-8fde6cc40ab2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"281ab006-f27d-4cb5-9d26-8fde6cc40ab2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.297688 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clndf\" (UniqueName: \"kubernetes.io/projected/281ab006-f27d-4cb5-9d26-8fde6cc40ab2-kube-api-access-clndf\") pod \"nova-cell0-conductor-0\" (UID: \"281ab006-f27d-4cb5-9d26-8fde6cc40ab2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.431686 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:27 crc kubenswrapper[4708]: I0320 16:23:27.948481 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 16:23:28 crc kubenswrapper[4708]: I0320 16:23:28.044767 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"281ab006-f27d-4cb5-9d26-8fde6cc40ab2","Type":"ContainerStarted","Data":"f1ba6008d97b389c495f175d0a52188c868a7a13414bc7df145d8383cb298395"} Mar 20 16:23:29 crc kubenswrapper[4708]: I0320 16:23:29.053493 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"281ab006-f27d-4cb5-9d26-8fde6cc40ab2","Type":"ContainerStarted","Data":"88e301f9a580d9045ea770e8c68c657491248c7c73ac341e7345d7343a5d5afe"} Mar 20 16:23:29 crc kubenswrapper[4708]: I0320 16:23:29.053979 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:29 crc kubenswrapper[4708]: I0320 16:23:29.072444 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.072422269 podStartE2EDuration="2.072422269s" podCreationTimestamp="2026-03-20 16:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:29.066928209 +0000 UTC m=+1363.741264934" watchObservedRunningTime="2026-03-20 16:23:29.072422269 +0000 UTC m=+1363.746758984" Mar 20 16:23:37 crc kubenswrapper[4708]: I0320 16:23:37.473022 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 16:23:37 crc kubenswrapper[4708]: I0320 16:23:37.950266 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-2hpvr"] Mar 20 16:23:37 crc kubenswrapper[4708]: I0320 16:23:37.951473 4708 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:37 crc kubenswrapper[4708]: I0320 16:23:37.957827 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 16:23:37 crc kubenswrapper[4708]: I0320 16:23:37.960271 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2hpvr"] Mar 20 16:23:37 crc kubenswrapper[4708]: I0320 16:23:37.971607 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.005648 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-config-data\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.005978 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-scripts\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.006346 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhh4\" (UniqueName: \"kubernetes.io/projected/af676e7f-5129-436c-9451-8a9b1c8c19c0-kube-api-access-kjhh4\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.006393 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.107808 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-scripts\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.108195 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhh4\" (UniqueName: \"kubernetes.io/projected/af676e7f-5129-436c-9451-8a9b1c8c19c0-kube-api-access-kjhh4\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.108218 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.108875 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-config-data\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.115382 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.116119 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-config-data\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.118167 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-scripts\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.161348 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhh4\" (UniqueName: \"kubernetes.io/projected/af676e7f-5129-436c-9451-8a9b1c8c19c0-kube-api-access-kjhh4\") pod \"nova-cell0-cell-mapping-2hpvr\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.183388 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.185162 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.189168 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.195189 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.230741 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.232380 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.251186 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.265587 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.315280 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-config-data\") pod \"nova-scheduler-0\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.315361 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.315416 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.315444 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-config-data\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.315479 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-logs\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.315515 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbx5\" (UniqueName: \"kubernetes.io/projected/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-kube-api-access-vjbx5\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.315544 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8gsw\" (UniqueName: \"kubernetes.io/projected/a8464020-3eec-491d-b1ba-12afbf0b017c-kube-api-access-h8gsw\") pod \"nova-scheduler-0\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.325370 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.329414 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7p9zm"] Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.332057 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.378570 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.384308 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.388459 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.423261 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbx5\" (UniqueName: \"kubernetes.io/projected/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-kube-api-access-vjbx5\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.423337 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8gsw\" (UniqueName: \"kubernetes.io/projected/a8464020-3eec-491d-b1ba-12afbf0b017c-kube-api-access-h8gsw\") pod \"nova-scheduler-0\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.423552 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-config-data\") pod \"nova-scheduler-0\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " 
pod="openstack/nova-scheduler-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.423622 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.423702 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.423737 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-config-data\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.423779 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-logs\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.424276 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-logs\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.437682 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.447774 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.452166 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-config-data\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.458309 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-config-data\") pod \"nova-scheduler-0\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.468281 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8gsw\" (UniqueName: \"kubernetes.io/projected/a8464020-3eec-491d-b1ba-12afbf0b017c-kube-api-access-h8gsw\") pod \"nova-scheduler-0\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.474991 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbx5\" (UniqueName: \"kubernetes.io/projected/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-kube-api-access-vjbx5\") pod \"nova-metadata-0\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " pod="openstack/nova-metadata-0" Mar 20 16:23:38 
crc kubenswrapper[4708]: I0320 16:23:38.484856 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7p9zm"] Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.526132 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-config\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.526184 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-svc\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.526243 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.527861 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.527902 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.527989 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glmb2\" (UniqueName: \"kubernetes.io/projected/fc8feabe-d295-4e0a-b541-9bfd25628128-kube-api-access-glmb2\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.528017 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-config-data\") pod \"nova-api-0\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.528057 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b6cd1a-61a4-403a-964e-f502cb67882f-logs\") pod \"nova-api-0\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.528072 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtsfm\" (UniqueName: \"kubernetes.io/projected/b1b6cd1a-61a4-403a-964e-f502cb67882f-kube-api-access-wtsfm\") pod \"nova-api-0\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.528124 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.558024 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.586848 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.596610 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.629887 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.629938 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-config\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.629964 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-svc\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.630014 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.630049 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.630073 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.630151 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glmb2\" (UniqueName: \"kubernetes.io/projected/fc8feabe-d295-4e0a-b541-9bfd25628128-kube-api-access-glmb2\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.630180 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-config-data\") pod \"nova-api-0\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.630222 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b6cd1a-61a4-403a-964e-f502cb67882f-logs\") pod 
\"nova-api-0\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.630244 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtsfm\" (UniqueName: \"kubernetes.io/projected/b1b6cd1a-61a4-403a-964e-f502cb67882f-kube-api-access-wtsfm\") pod \"nova-api-0\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.631252 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.633435 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.633610 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-svc\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.633654 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-config\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.634372 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.635307 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.635362 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b6cd1a-61a4-403a-964e-f502cb67882f-logs\") pod \"nova-api-0\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.639588 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.644168 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.646223 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.646543 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.646570 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-config-data\") pod \"nova-api-0\" (UID: 
\"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.668443 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glmb2\" (UniqueName: \"kubernetes.io/projected/fc8feabe-d295-4e0a-b541-9bfd25628128-kube-api-access-glmb2\") pod \"dnsmasq-dns-757b4f8459-7p9zm\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.671243 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtsfm\" (UniqueName: \"kubernetes.io/projected/b1b6cd1a-61a4-403a-964e-f502cb67882f-kube-api-access-wtsfm\") pod \"nova-api-0\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.760245 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.804203 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.847950 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.848064 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.848136 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm6mk\" (UniqueName: \"kubernetes.io/projected/ddd133b9-300b-403d-a469-58790dee930c-kube-api-access-tm6mk\") pod \"nova-cell1-novncproxy-0\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.950569 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.950652 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.950823 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm6mk\" (UniqueName: \"kubernetes.io/projected/ddd133b9-300b-403d-a469-58790dee930c-kube-api-access-tm6mk\") pod \"nova-cell1-novncproxy-0\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.957333 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.964583 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.969863 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm6mk\" (UniqueName: \"kubernetes.io/projected/ddd133b9-300b-403d-a469-58790dee930c-kube-api-access-tm6mk\") pod \"nova-cell1-novncproxy-0\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.971923 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:38 crc kubenswrapper[4708]: I0320 16:23:38.975840 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2hpvr"] Mar 20 16:23:39 crc kubenswrapper[4708]: W0320 16:23:39.075250 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf676e7f_5129_436c_9451_8a9b1c8c19c0.slice/crio-84915d182a57c84c0d30885753a625a1d23becd15174c9ff41a9700696654d79 WatchSource:0}: Error finding container 84915d182a57c84c0d30885753a625a1d23becd15174c9ff41a9700696654d79: Status 404 returned error can't find the container with id 84915d182a57c84c0d30885753a625a1d23becd15174c9ff41a9700696654d79 Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.170021 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2hpvr" event={"ID":"af676e7f-5129-436c-9451-8a9b1c8c19c0","Type":"ContainerStarted","Data":"84915d182a57c84c0d30885753a625a1d23becd15174c9ff41a9700696654d79"} Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.196354 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpnrd"] Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.197623 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.199886 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.200486 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.209709 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpnrd"] Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.360413 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.360994 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-scripts\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.361021 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8cdx\" (UniqueName: \"kubernetes.io/projected/384f206f-1142-4027-90ef-7adfeda8a5f5-kube-api-access-d8cdx\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.361156 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-config-data\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.465693 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-config-data\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.465801 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.465901 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-scripts\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.465921 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8cdx\" (UniqueName: \"kubernetes.io/projected/384f206f-1142-4027-90ef-7adfeda8a5f5-kube-api-access-d8cdx\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.473238 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.493219 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-scripts\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.493654 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-config-data\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.497738 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8cdx\" (UniqueName: \"kubernetes.io/projected/384f206f-1142-4027-90ef-7adfeda8a5f5-kube-api-access-d8cdx\") pod \"nova-cell1-conductor-db-sync-zpnrd\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.564493 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.619486 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7p9zm"] Mar 20 16:23:39 crc kubenswrapper[4708]: W0320 16:23:39.630828 4708 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8464020_3eec_491d_b1ba_12afbf0b017c.slice/crio-be1757473f688a3e84fabe2608ebf0d9b5f6683b965f8d161d05624c58221e31 WatchSource:0}: Error finding container be1757473f688a3e84fabe2608ebf0d9b5f6683b965f8d161d05624c58221e31: Status 404 returned error can't find the container with id be1757473f688a3e84fabe2608ebf0d9b5f6683b965f8d161d05624c58221e31 Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.633811 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.639419 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.724819 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:23:39 crc kubenswrapper[4708]: I0320 16:23:39.743258 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:23:39 crc kubenswrapper[4708]: W0320 16:23:39.746965 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddd133b9_300b_403d_a469_58790dee930c.slice/crio-9bce4e37d21a9eec48cd80497d451eda4b688007ed81c6b810069d529a6cf571 WatchSource:0}: Error finding container 9bce4e37d21a9eec48cd80497d451eda4b688007ed81c6b810069d529a6cf571: Status 404 returned error can't find the container with id 9bce4e37d21a9eec48cd80497d451eda4b688007ed81c6b810069d529a6cf571 Mar 20 16:23:40 crc kubenswrapper[4708]: I0320 16:23:40.183629 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpnrd"] Mar 20 16:23:40 crc kubenswrapper[4708]: I0320 16:23:40.189851 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4","Type":"ContainerStarted","Data":"ec198aebba5f734e4a6d9ac66df948e2082b2f25c0802146c065a3c4a61d69c9"} Mar 20 16:23:40 crc kubenswrapper[4708]: I0320 16:23:40.193052 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ddd133b9-300b-403d-a469-58790dee930c","Type":"ContainerStarted","Data":"9bce4e37d21a9eec48cd80497d451eda4b688007ed81c6b810069d529a6cf571"} Mar 20 16:23:40 crc kubenswrapper[4708]: I0320 16:23:40.194998 4708 generic.go:334] "Generic (PLEG): container finished" podID="fc8feabe-d295-4e0a-b541-9bfd25628128" containerID="e2d337afd8cd334dc8e3e63e3bb1b44fe7ebfdb48b3138de04d17ec742db897a" exitCode=0 Mar 20 16:23:40 crc kubenswrapper[4708]: I0320 16:23:40.195042 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" event={"ID":"fc8feabe-d295-4e0a-b541-9bfd25628128","Type":"ContainerDied","Data":"e2d337afd8cd334dc8e3e63e3bb1b44fe7ebfdb48b3138de04d17ec742db897a"} Mar 20 16:23:40 crc kubenswrapper[4708]: I0320 16:23:40.195089 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" event={"ID":"fc8feabe-d295-4e0a-b541-9bfd25628128","Type":"ContainerStarted","Data":"3e3aeb71c3aa2c2d40861cba57ac5f17ee96f5a78c3e12b32b272182304e080b"} Mar 20 16:23:40 crc kubenswrapper[4708]: I0320 16:23:40.196442 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2hpvr" event={"ID":"af676e7f-5129-436c-9451-8a9b1c8c19c0","Type":"ContainerStarted","Data":"b1ec4d5b3497d1f22acdc9688aa9dfb36a8195784ada83e5b4f1e1bea401c955"} Mar 20 16:23:40 crc kubenswrapper[4708]: I0320 16:23:40.198600 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b1b6cd1a-61a4-403a-964e-f502cb67882f","Type":"ContainerStarted","Data":"69a9717d864b727278f50474a79cc92df62822441193fa75a3e13911cec35b4b"} Mar 20 16:23:40 crc kubenswrapper[4708]: 
I0320 16:23:40.205926 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a8464020-3eec-491d-b1ba-12afbf0b017c","Type":"ContainerStarted","Data":"be1757473f688a3e84fabe2608ebf0d9b5f6683b965f8d161d05624c58221e31"} Mar 20 16:23:40 crc kubenswrapper[4708]: I0320 16:23:40.248421 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-2hpvr" podStartSLOduration=3.248401893 podStartE2EDuration="3.248401893s" podCreationTimestamp="2026-03-20 16:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:40.236266133 +0000 UTC m=+1374.910602848" watchObservedRunningTime="2026-03-20 16:23:40.248401893 +0000 UTC m=+1374.922738608" Mar 20 16:23:41 crc kubenswrapper[4708]: I0320 16:23:41.217468 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" event={"ID":"fc8feabe-d295-4e0a-b541-9bfd25628128","Type":"ContainerStarted","Data":"0063a85d53b4de8c9ef6265aee8aea5aeedd224c1af6af410dfcfbaa21d2e673"} Mar 20 16:23:41 crc kubenswrapper[4708]: I0320 16:23:41.217955 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:41 crc kubenswrapper[4708]: I0320 16:23:41.220159 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpnrd" event={"ID":"384f206f-1142-4027-90ef-7adfeda8a5f5","Type":"ContainerStarted","Data":"b3655bcb7ad6f49cf31a88c893fcf8827f27a99cacd08b0af71387e156bbbf09"} Mar 20 16:23:41 crc kubenswrapper[4708]: I0320 16:23:41.220377 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpnrd" event={"ID":"384f206f-1142-4027-90ef-7adfeda8a5f5","Type":"ContainerStarted","Data":"aba21bd87cb348927d35f12b05794dec9d58f7fa65e28c032541099adbce8ab1"} Mar 20 16:23:41 crc 
kubenswrapper[4708]: I0320 16:23:41.236476 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" podStartSLOduration=3.23645883 podStartE2EDuration="3.23645883s" podCreationTimestamp="2026-03-20 16:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:41.232367249 +0000 UTC m=+1375.906703974" watchObservedRunningTime="2026-03-20 16:23:41.23645883 +0000 UTC m=+1375.910795545" Mar 20 16:23:41 crc kubenswrapper[4708]: I0320 16:23:41.257076 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zpnrd" podStartSLOduration=2.257058869 podStartE2EDuration="2.257058869s" podCreationTimestamp="2026-03-20 16:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:41.248420315 +0000 UTC m=+1375.922757030" watchObservedRunningTime="2026-03-20 16:23:41.257058869 +0000 UTC m=+1375.931395584" Mar 20 16:23:42 crc kubenswrapper[4708]: I0320 16:23:42.080633 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:23:42 crc kubenswrapper[4708]: I0320 16:23:42.099247 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.246648 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a8464020-3eec-491d-b1ba-12afbf0b017c","Type":"ContainerStarted","Data":"2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61"} Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.253866 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4","Type":"ContainerStarted","Data":"27ba12f7d84ac8edb2c5b559c04d5ce79ae4491d1838c76577e0b6c59b8e095e"} Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.254055 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" containerName="nova-metadata-log" containerID="cri-o://27ba12f7d84ac8edb2c5b559c04d5ce79ae4491d1838c76577e0b6c59b8e095e" gracePeriod=30 Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.254382 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" containerName="nova-metadata-metadata" containerID="cri-o://3b456a81cb7201f04a2346e2de43280f2de53da90db40ec10ff5b5e2e9c7b99c" gracePeriod=30 Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.263878 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ddd133b9-300b-403d-a469-58790dee930c","Type":"ContainerStarted","Data":"13cbad05c000901dcea7491f96b1f0073385a627c23a0dc556a431c8a31067d2"} Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.264075 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ddd133b9-300b-403d-a469-58790dee930c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://13cbad05c000901dcea7491f96b1f0073385a627c23a0dc556a431c8a31067d2" gracePeriod=30 Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.280460 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.2707705000000002 podStartE2EDuration="5.280436077s" podCreationTimestamp="2026-03-20 16:23:38 +0000 UTC" firstStartedPulling="2026-03-20 16:23:39.633804914 +0000 UTC m=+1374.308141629" lastFinishedPulling="2026-03-20 16:23:42.643470491 +0000 UTC 
m=+1377.317807206" observedRunningTime="2026-03-20 16:23:43.265049139 +0000 UTC m=+1377.939385854" watchObservedRunningTime="2026-03-20 16:23:43.280436077 +0000 UTC m=+1377.954772812" Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.282021 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b1b6cd1a-61a4-403a-964e-f502cb67882f","Type":"ContainerStarted","Data":"4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444"} Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.300899 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.406138388 podStartE2EDuration="5.300880253s" podCreationTimestamp="2026-03-20 16:23:38 +0000 UTC" firstStartedPulling="2026-03-20 16:23:39.750389771 +0000 UTC m=+1374.424726486" lastFinishedPulling="2026-03-20 16:23:42.645131636 +0000 UTC m=+1377.319468351" observedRunningTime="2026-03-20 16:23:43.286221845 +0000 UTC m=+1377.960558560" watchObservedRunningTime="2026-03-20 16:23:43.300880253 +0000 UTC m=+1377.975216958" Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.338436 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.381668964 podStartE2EDuration="5.338417733s" podCreationTimestamp="2026-03-20 16:23:38 +0000 UTC" firstStartedPulling="2026-03-20 16:23:39.688417978 +0000 UTC m=+1374.362754693" lastFinishedPulling="2026-03-20 16:23:42.645166747 +0000 UTC m=+1377.319503462" observedRunningTime="2026-03-20 16:23:43.302285141 +0000 UTC m=+1377.976621866" watchObservedRunningTime="2026-03-20 16:23:43.338417733 +0000 UTC m=+1378.012754448" Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.588471 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 16:23:43 crc kubenswrapper[4708]: I0320 16:23:43.972580 4708 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:23:44 crc kubenswrapper[4708]: I0320 16:23:44.295204 4708 generic.go:334] "Generic (PLEG): container finished" podID="41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" containerID="27ba12f7d84ac8edb2c5b559c04d5ce79ae4491d1838c76577e0b6c59b8e095e" exitCode=143 Mar 20 16:23:44 crc kubenswrapper[4708]: I0320 16:23:44.295302 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4","Type":"ContainerStarted","Data":"3b456a81cb7201f04a2346e2de43280f2de53da90db40ec10ff5b5e2e9c7b99c"} Mar 20 16:23:44 crc kubenswrapper[4708]: I0320 16:23:44.295371 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4","Type":"ContainerDied","Data":"27ba12f7d84ac8edb2c5b559c04d5ce79ae4491d1838c76577e0b6c59b8e095e"} Mar 20 16:23:44 crc kubenswrapper[4708]: I0320 16:23:44.298482 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b1b6cd1a-61a4-403a-964e-f502cb67882f","Type":"ContainerStarted","Data":"82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460"} Mar 20 16:23:44 crc kubenswrapper[4708]: I0320 16:23:44.329918 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.420282586 podStartE2EDuration="6.329899533s" podCreationTimestamp="2026-03-20 16:23:38 +0000 UTC" firstStartedPulling="2026-03-20 16:23:39.738451098 +0000 UTC m=+1374.412787813" lastFinishedPulling="2026-03-20 16:23:42.648068045 +0000 UTC m=+1377.322404760" observedRunningTime="2026-03-20 16:23:44.316242571 +0000 UTC m=+1378.990579306" watchObservedRunningTime="2026-03-20 16:23:44.329899533 +0000 UTC m=+1379.004236248" Mar 20 16:23:47 crc kubenswrapper[4708]: I0320 16:23:47.328049 4708 generic.go:334] "Generic (PLEG): container finished" podID="af676e7f-5129-436c-9451-8a9b1c8c19c0" 
containerID="b1ec4d5b3497d1f22acdc9688aa9dfb36a8195784ada83e5b4f1e1bea401c955" exitCode=0 Mar 20 16:23:47 crc kubenswrapper[4708]: I0320 16:23:47.328142 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2hpvr" event={"ID":"af676e7f-5129-436c-9451-8a9b1c8c19c0","Type":"ContainerDied","Data":"b1ec4d5b3497d1f22acdc9688aa9dfb36a8195784ada83e5b4f1e1bea401c955"} Mar 20 16:23:47 crc kubenswrapper[4708]: I0320 16:23:47.356018 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.587762 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.618840 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.741821 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.763989 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.805729 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.805787 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.816809 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjhh4\" (UniqueName: \"kubernetes.io/projected/af676e7f-5129-436c-9451-8a9b1c8c19c0-kube-api-access-kjhh4\") pod \"af676e7f-5129-436c-9451-8a9b1c8c19c0\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.816951 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-scripts\") pod \"af676e7f-5129-436c-9451-8a9b1c8c19c0\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.816990 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-config-data\") pod \"af676e7f-5129-436c-9451-8a9b1c8c19c0\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.817133 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-combined-ca-bundle\") pod \"af676e7f-5129-436c-9451-8a9b1c8c19c0\" (UID: \"af676e7f-5129-436c-9451-8a9b1c8c19c0\") " Mar 
20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.827967 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-scripts" (OuterVolumeSpecName: "scripts") pod "af676e7f-5129-436c-9451-8a9b1c8c19c0" (UID: "af676e7f-5129-436c-9451-8a9b1c8c19c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.853913 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5dlpz"] Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.854198 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" podUID="484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" containerName="dnsmasq-dns" containerID="cri-o://a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316" gracePeriod=10 Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.862719 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af676e7f-5129-436c-9451-8a9b1c8c19c0-kube-api-access-kjhh4" (OuterVolumeSpecName: "kube-api-access-kjhh4") pod "af676e7f-5129-436c-9451-8a9b1c8c19c0" (UID: "af676e7f-5129-436c-9451-8a9b1c8c19c0"). InnerVolumeSpecName "kube-api-access-kjhh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.889180 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-config-data" (OuterVolumeSpecName: "config-data") pod "af676e7f-5129-436c-9451-8a9b1c8c19c0" (UID: "af676e7f-5129-436c-9451-8a9b1c8c19c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.909785 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af676e7f-5129-436c-9451-8a9b1c8c19c0" (UID: "af676e7f-5129-436c-9451-8a9b1c8c19c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.920202 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.920568 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.920660 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjhh4\" (UniqueName: \"kubernetes.io/projected/af676e7f-5129-436c-9451-8a9b1c8c19c0-kube-api-access-kjhh4\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:48 crc kubenswrapper[4708]: I0320 16:23:48.921188 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af676e7f-5129-436c-9451-8a9b1c8c19c0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.304132 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.366951 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2hpvr" event={"ID":"af676e7f-5129-436c-9451-8a9b1c8c19c0","Type":"ContainerDied","Data":"84915d182a57c84c0d30885753a625a1d23becd15174c9ff41a9700696654d79"} Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.367005 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84915d182a57c84c0d30885753a625a1d23becd15174c9ff41a9700696654d79" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.367091 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2hpvr" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.373015 4708 generic.go:334] "Generic (PLEG): container finished" podID="384f206f-1142-4027-90ef-7adfeda8a5f5" containerID="b3655bcb7ad6f49cf31a88c893fcf8827f27a99cacd08b0af71387e156bbbf09" exitCode=0 Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.373111 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpnrd" event={"ID":"384f206f-1142-4027-90ef-7adfeda8a5f5","Type":"ContainerDied","Data":"b3655bcb7ad6f49cf31a88c893fcf8827f27a99cacd08b0af71387e156bbbf09"} Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.377039 4708 generic.go:334] "Generic (PLEG): container finished" podID="484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" containerID="a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316" exitCode=0 Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.377815 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.377879 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" event={"ID":"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd","Type":"ContainerDied","Data":"a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316"} Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.377934 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5dlpz" event={"ID":"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd","Type":"ContainerDied","Data":"75dd15ffbfb5cd14a49ac8ae81cd1881c76f359a48b317bfcacce62723be43a3"} Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.377975 4708 scope.go:117] "RemoveContainer" containerID="a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.410836 4708 scope.go:117] "RemoveContainer" containerID="b555b5cfc5ceb4d5d5a1e7f305edaecc7869b07e6a854782790909894e85b7dc" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.429554 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-swift-storage-0\") pod \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.429617 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs89r\" (UniqueName: \"kubernetes.io/projected/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-kube-api-access-vs89r\") pod \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.430918 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-config\") pod \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.431057 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-sb\") pod \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.431163 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-svc\") pod \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.431239 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-nb\") pod \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\" (UID: \"484ca5ef-d4b8-4c48-8c70-84ff9059dcdd\") " Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.436999 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-kube-api-access-vs89r" (OuterVolumeSpecName: "kube-api-access-vs89r") pod "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" (UID: "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd"). InnerVolumeSpecName "kube-api-access-vs89r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.450747 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.457112 4708 scope.go:117] "RemoveContainer" containerID="a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316" Mar 20 16:23:49 crc kubenswrapper[4708]: E0320 16:23:49.463877 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316\": container with ID starting with a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316 not found: ID does not exist" containerID="a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.463926 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316"} err="failed to get container status \"a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316\": rpc error: code = NotFound desc = could not find container \"a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316\": container with ID starting with a7817c05a609678f8a4be48bdb66b23826b4c61c84865660b230eab5e21b6316 not found: ID does not exist" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.463948 4708 scope.go:117] "RemoveContainer" containerID="b555b5cfc5ceb4d5d5a1e7f305edaecc7869b07e6a854782790909894e85b7dc" Mar 20 16:23:49 crc kubenswrapper[4708]: E0320 16:23:49.464575 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b555b5cfc5ceb4d5d5a1e7f305edaecc7869b07e6a854782790909894e85b7dc\": container with ID starting with b555b5cfc5ceb4d5d5a1e7f305edaecc7869b07e6a854782790909894e85b7dc 
not found: ID does not exist" containerID="b555b5cfc5ceb4d5d5a1e7f305edaecc7869b07e6a854782790909894e85b7dc" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.464644 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b555b5cfc5ceb4d5d5a1e7f305edaecc7869b07e6a854782790909894e85b7dc"} err="failed to get container status \"b555b5cfc5ceb4d5d5a1e7f305edaecc7869b07e6a854782790909894e85b7dc\": rpc error: code = NotFound desc = could not find container \"b555b5cfc5ceb4d5d5a1e7f305edaecc7869b07e6a854782790909894e85b7dc\": container with ID starting with b555b5cfc5ceb4d5d5a1e7f305edaecc7869b07e6a854782790909894e85b7dc not found: ID does not exist" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.492578 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" (UID: "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.498092 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" (UID: "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.513513 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" (UID: "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.522914 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-config" (OuterVolumeSpecName: "config") pod "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" (UID: "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.534171 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.534245 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.534255 4708 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.534265 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs89r\" (UniqueName: \"kubernetes.io/projected/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-kube-api-access-vs89r\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.534276 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.534519 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" (UID: "484ca5ef-d4b8-4c48-8c70-84ff9059dcdd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.560004 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.560306 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerName="nova-api-log" containerID="cri-o://4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444" gracePeriod=30 Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.560372 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerName="nova-api-api" containerID="cri-o://82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460" gracePeriod=30 Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.566223 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": EOF" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.566463 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": EOF" Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.636341 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:49 crc 
kubenswrapper[4708]: I0320 16:23:49.780136 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5dlpz"] Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.789204 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5dlpz"] Mar 20 16:23:49 crc kubenswrapper[4708]: I0320 16:23:49.986420 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:23:50 crc kubenswrapper[4708]: I0320 16:23:50.122262 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" path="/var/lib/kubelet/pods/484ca5ef-d4b8-4c48-8c70-84ff9059dcdd/volumes" Mar 20 16:23:50 crc kubenswrapper[4708]: I0320 16:23:50.389361 4708 generic.go:334] "Generic (PLEG): container finished" podID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerID="4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444" exitCode=143 Mar 20 16:23:50 crc kubenswrapper[4708]: I0320 16:23:50.389438 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b1b6cd1a-61a4-403a-964e-f502cb67882f","Type":"ContainerDied","Data":"4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444"} Mar 20 16:23:50 crc kubenswrapper[4708]: I0320 16:23:50.951029 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.064162 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8cdx\" (UniqueName: \"kubernetes.io/projected/384f206f-1142-4027-90ef-7adfeda8a5f5-kube-api-access-d8cdx\") pod \"384f206f-1142-4027-90ef-7adfeda8a5f5\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.064476 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-combined-ca-bundle\") pod \"384f206f-1142-4027-90ef-7adfeda8a5f5\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.064582 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-scripts\") pod \"384f206f-1142-4027-90ef-7adfeda8a5f5\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.064708 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-config-data\") pod \"384f206f-1142-4027-90ef-7adfeda8a5f5\" (UID: \"384f206f-1142-4027-90ef-7adfeda8a5f5\") " Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.072917 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384f206f-1142-4027-90ef-7adfeda8a5f5-kube-api-access-d8cdx" (OuterVolumeSpecName: "kube-api-access-d8cdx") pod "384f206f-1142-4027-90ef-7adfeda8a5f5" (UID: "384f206f-1142-4027-90ef-7adfeda8a5f5"). InnerVolumeSpecName "kube-api-access-d8cdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.072979 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-scripts" (OuterVolumeSpecName: "scripts") pod "384f206f-1142-4027-90ef-7adfeda8a5f5" (UID: "384f206f-1142-4027-90ef-7adfeda8a5f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.095583 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-config-data" (OuterVolumeSpecName: "config-data") pod "384f206f-1142-4027-90ef-7adfeda8a5f5" (UID: "384f206f-1142-4027-90ef-7adfeda8a5f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.108789 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "384f206f-1142-4027-90ef-7adfeda8a5f5" (UID: "384f206f-1142-4027-90ef-7adfeda8a5f5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.167909 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.167945 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.167957 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384f206f-1142-4027-90ef-7adfeda8a5f5-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.167967 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8cdx\" (UniqueName: \"kubernetes.io/projected/384f206f-1142-4027-90ef-7adfeda8a5f5-kube-api-access-d8cdx\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.402599 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a8464020-3eec-491d-b1ba-12afbf0b017c" containerName="nova-scheduler-scheduler" containerID="cri-o://2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61" gracePeriod=30 Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.403043 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpnrd" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.409137 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpnrd" event={"ID":"384f206f-1142-4027-90ef-7adfeda8a5f5","Type":"ContainerDied","Data":"aba21bd87cb348927d35f12b05794dec9d58f7fa65e28c032541099adbce8ab1"} Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.409187 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aba21bd87cb348927d35f12b05794dec9d58f7fa65e28c032541099adbce8ab1" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.491783 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 16:23:51 crc kubenswrapper[4708]: E0320 16:23:51.492298 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" containerName="init" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.492322 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" containerName="init" Mar 20 16:23:51 crc kubenswrapper[4708]: E0320 16:23:51.492336 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af676e7f-5129-436c-9451-8a9b1c8c19c0" containerName="nova-manage" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.492343 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="af676e7f-5129-436c-9451-8a9b1c8c19c0" containerName="nova-manage" Mar 20 16:23:51 crc kubenswrapper[4708]: E0320 16:23:51.492358 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" containerName="dnsmasq-dns" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.492364 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" containerName="dnsmasq-dns" Mar 20 16:23:51 crc kubenswrapper[4708]: E0320 16:23:51.492393 4708 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384f206f-1142-4027-90ef-7adfeda8a5f5" containerName="nova-cell1-conductor-db-sync" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.492398 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="384f206f-1142-4027-90ef-7adfeda8a5f5" containerName="nova-cell1-conductor-db-sync" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.492598 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="384f206f-1142-4027-90ef-7adfeda8a5f5" containerName="nova-cell1-conductor-db-sync" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.492628 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="af676e7f-5129-436c-9451-8a9b1c8c19c0" containerName="nova-manage" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.492641 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="484ca5ef-d4b8-4c48-8c70-84ff9059dcdd" containerName="dnsmasq-dns" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.493522 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.498381 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.509834 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.680065 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd84187f-6a03-4149-89b8-bc697c1ad82c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dd84187f-6a03-4149-89b8-bc697c1ad82c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.680269 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd84187f-6a03-4149-89b8-bc697c1ad82c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dd84187f-6a03-4149-89b8-bc697c1ad82c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.680347 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvmrx\" (UniqueName: \"kubernetes.io/projected/dd84187f-6a03-4149-89b8-bc697c1ad82c-kube-api-access-gvmrx\") pod \"nova-cell1-conductor-0\" (UID: \"dd84187f-6a03-4149-89b8-bc697c1ad82c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.782322 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd84187f-6a03-4149-89b8-bc697c1ad82c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dd84187f-6a03-4149-89b8-bc697c1ad82c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:51 crc 
kubenswrapper[4708]: I0320 16:23:51.782410 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvmrx\" (UniqueName: \"kubernetes.io/projected/dd84187f-6a03-4149-89b8-bc697c1ad82c-kube-api-access-gvmrx\") pod \"nova-cell1-conductor-0\" (UID: \"dd84187f-6a03-4149-89b8-bc697c1ad82c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.782487 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd84187f-6a03-4149-89b8-bc697c1ad82c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dd84187f-6a03-4149-89b8-bc697c1ad82c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.796514 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd84187f-6a03-4149-89b8-bc697c1ad82c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dd84187f-6a03-4149-89b8-bc697c1ad82c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.796546 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd84187f-6a03-4149-89b8-bc697c1ad82c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dd84187f-6a03-4149-89b8-bc697c1ad82c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.800827 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvmrx\" (UniqueName: \"kubernetes.io/projected/dd84187f-6a03-4149-89b8-bc697c1ad82c-kube-api-access-gvmrx\") pod \"nova-cell1-conductor-0\" (UID: \"dd84187f-6a03-4149-89b8-bc697c1ad82c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:51 crc kubenswrapper[4708]: I0320 16:23:51.816416 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:52 crc kubenswrapper[4708]: I0320 16:23:52.300799 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:23:52 crc kubenswrapper[4708]: I0320 16:23:52.301409 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="301fee8b-2c0b-46c6-810c-a89d85b4efb4" containerName="kube-state-metrics" containerID="cri-o://2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06" gracePeriod=30 Mar 20 16:23:52 crc kubenswrapper[4708]: I0320 16:23:52.323778 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 16:23:52 crc kubenswrapper[4708]: I0320 16:23:52.416556 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dd84187f-6a03-4149-89b8-bc697c1ad82c","Type":"ContainerStarted","Data":"69c97235d69bae9cc08e371140751d1ebe83973d53c5c5e08c4fe0121732ae33"} Mar 20 16:23:52 crc kubenswrapper[4708]: I0320 16:23:52.752382 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:23:52 crc kubenswrapper[4708]: I0320 16:23:52.909663 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxnhb\" (UniqueName: \"kubernetes.io/projected/301fee8b-2c0b-46c6-810c-a89d85b4efb4-kube-api-access-rxnhb\") pod \"301fee8b-2c0b-46c6-810c-a89d85b4efb4\" (UID: \"301fee8b-2c0b-46c6-810c-a89d85b4efb4\") " Mar 20 16:23:52 crc kubenswrapper[4708]: I0320 16:23:52.917871 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301fee8b-2c0b-46c6-810c-a89d85b4efb4-kube-api-access-rxnhb" (OuterVolumeSpecName: "kube-api-access-rxnhb") pod "301fee8b-2c0b-46c6-810c-a89d85b4efb4" (UID: "301fee8b-2c0b-46c6-810c-a89d85b4efb4"). 
InnerVolumeSpecName "kube-api-access-rxnhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.011996 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxnhb\" (UniqueName: \"kubernetes.io/projected/301fee8b-2c0b-46c6-810c-a89d85b4efb4-kube-api-access-rxnhb\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.428984 4708 generic.go:334] "Generic (PLEG): container finished" podID="301fee8b-2c0b-46c6-810c-a89d85b4efb4" containerID="2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06" exitCode=2 Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.429080 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"301fee8b-2c0b-46c6-810c-a89d85b4efb4","Type":"ContainerDied","Data":"2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06"} Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.429143 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"301fee8b-2c0b-46c6-810c-a89d85b4efb4","Type":"ContainerDied","Data":"643fc60b44a7b26cfda80e35eb9dddcf0292fa1fbb852a49eaeac6289e248613"} Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.429100 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.429169 4708 scope.go:117] "RemoveContainer" containerID="2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.432091 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dd84187f-6a03-4149-89b8-bc697c1ad82c","Type":"ContainerStarted","Data":"49e8eeae7c3eaf24d1909919058fab17efa6b4290001e2d429391be2468ee336"} Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.432990 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.467089 4708 scope.go:117] "RemoveContainer" containerID="2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06" Mar 20 16:23:53 crc kubenswrapper[4708]: E0320 16:23:53.467724 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06\": container with ID starting with 2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06 not found: ID does not exist" containerID="2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.467770 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06"} err="failed to get container status \"2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06\": rpc error: code = NotFound desc = could not find container \"2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06\": container with ID starting with 2d32a9c3cbef0fab74142d9c3ff8e5f003b30df025bb7b43c6549059027a5f06 not found: ID does not exist" Mar 20 16:23:53 crc kubenswrapper[4708]: 
I0320 16:23:53.486960 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.48694001 podStartE2EDuration="2.48694001s" podCreationTimestamp="2026-03-20 16:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:53.482754107 +0000 UTC m=+1388.157090822" watchObservedRunningTime="2026-03-20 16:23:53.48694001 +0000 UTC m=+1388.161276725" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.512379 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.536229 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.555752 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:23:53 crc kubenswrapper[4708]: E0320 16:23:53.556528 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301fee8b-2c0b-46c6-810c-a89d85b4efb4" containerName="kube-state-metrics" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.556629 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="301fee8b-2c0b-46c6-810c-a89d85b4efb4" containerName="kube-state-metrics" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.556911 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="301fee8b-2c0b-46c6-810c-a89d85b4efb4" containerName="kube-state-metrics" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.557697 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.563501 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.563776 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.577055 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:23:53 crc kubenswrapper[4708]: E0320 16:23:53.617866 4708 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 16:23:53 crc kubenswrapper[4708]: E0320 16:23:53.624111 4708 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 16:23:53 crc kubenswrapper[4708]: E0320 16:23:53.631623 4708 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 16:23:53 crc kubenswrapper[4708]: E0320 16:23:53.631745 4708 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/nova-scheduler-0" podUID="a8464020-3eec-491d-b1ba-12afbf0b017c" containerName="nova-scheduler-scheduler" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.729154 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zxjw\" (UniqueName: \"kubernetes.io/projected/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-kube-api-access-4zxjw\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.729248 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.729273 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.729532 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.831061 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.831418 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.831574 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.831655 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zxjw\" (UniqueName: \"kubernetes.io/projected/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-kube-api-access-4zxjw\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.850909 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.855976 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.856272 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.864607 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zxjw\" (UniqueName: \"kubernetes.io/projected/1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab-kube-api-access-4zxjw\") pod \"kube-state-metrics-0\" (UID: \"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab\") " pod="openstack/kube-state-metrics-0" Mar 20 16:23:53 crc kubenswrapper[4708]: I0320 16:23:53.882262 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 16:23:54 crc kubenswrapper[4708]: I0320 16:23:54.164929 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301fee8b-2c0b-46c6-810c-a89d85b4efb4" path="/var/lib/kubelet/pods/301fee8b-2c0b-46c6-810c-a89d85b4efb4/volumes" Mar 20 16:23:54 crc kubenswrapper[4708]: I0320 16:23:54.200908 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 16:23:54 crc kubenswrapper[4708]: I0320 16:23:54.449194 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab","Type":"ContainerStarted","Data":"f19143cc8d1bcdaa6c5683547a5e0d613a0e7ce2dd56877bf73f6ce30d756a22"} Mar 20 16:23:54 crc kubenswrapper[4708]: I0320 16:23:54.678062 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:54 crc kubenswrapper[4708]: I0320 16:23:54.678382 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="ceilometer-central-agent" containerID="cri-o://5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f" gracePeriod=30 Mar 20 16:23:54 crc kubenswrapper[4708]: I0320 16:23:54.678921 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="proxy-httpd" containerID="cri-o://63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0" gracePeriod=30 Mar 20 16:23:54 crc kubenswrapper[4708]: I0320 16:23:54.678989 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="sg-core" containerID="cri-o://d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd" gracePeriod=30 Mar 20 16:23:54 crc 
kubenswrapper[4708]: I0320 16:23:54.679033 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="ceilometer-notification-agent" containerID="cri-o://b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15" gracePeriod=30 Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.059647 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.155604 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-config-data\") pod \"a8464020-3eec-491d-b1ba-12afbf0b017c\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.155732 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-combined-ca-bundle\") pod \"a8464020-3eec-491d-b1ba-12afbf0b017c\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.155769 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8gsw\" (UniqueName: \"kubernetes.io/projected/a8464020-3eec-491d-b1ba-12afbf0b017c-kube-api-access-h8gsw\") pod \"a8464020-3eec-491d-b1ba-12afbf0b017c\" (UID: \"a8464020-3eec-491d-b1ba-12afbf0b017c\") " Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.161259 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8464020-3eec-491d-b1ba-12afbf0b017c-kube-api-access-h8gsw" (OuterVolumeSpecName: "kube-api-access-h8gsw") pod "a8464020-3eec-491d-b1ba-12afbf0b017c" (UID: "a8464020-3eec-491d-b1ba-12afbf0b017c"). InnerVolumeSpecName "kube-api-access-h8gsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.184839 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-config-data" (OuterVolumeSpecName: "config-data") pod "a8464020-3eec-491d-b1ba-12afbf0b017c" (UID: "a8464020-3eec-491d-b1ba-12afbf0b017c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.195293 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8464020-3eec-491d-b1ba-12afbf0b017c" (UID: "a8464020-3eec-491d-b1ba-12afbf0b017c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.257579 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.257608 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8gsw\" (UniqueName: \"kubernetes.io/projected/a8464020-3eec-491d-b1ba-12afbf0b017c-kube-api-access-h8gsw\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.257618 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8464020-3eec-491d-b1ba-12afbf0b017c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.463892 4708 generic.go:334] "Generic (PLEG): container finished" podID="a8464020-3eec-491d-b1ba-12afbf0b017c" containerID="2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61" 
exitCode=0 Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.463978 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.464015 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a8464020-3eec-491d-b1ba-12afbf0b017c","Type":"ContainerDied","Data":"2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61"} Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.464114 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a8464020-3eec-491d-b1ba-12afbf0b017c","Type":"ContainerDied","Data":"be1757473f688a3e84fabe2608ebf0d9b5f6683b965f8d161d05624c58221e31"} Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.464171 4708 scope.go:117] "RemoveContainer" containerID="2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.467791 4708 generic.go:334] "Generic (PLEG): container finished" podID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerID="63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0" exitCode=0 Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.467824 4708 generic.go:334] "Generic (PLEG): container finished" podID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerID="d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd" exitCode=2 Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.467836 4708 generic.go:334] "Generic (PLEG): container finished" podID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerID="5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f" exitCode=0 Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.467880 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"de69fe5a-0a95-4387-9c43-4ff1fe53bad4","Type":"ContainerDied","Data":"63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0"} Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.467907 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de69fe5a-0a95-4387-9c43-4ff1fe53bad4","Type":"ContainerDied","Data":"d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd"} Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.467918 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de69fe5a-0a95-4387-9c43-4ff1fe53bad4","Type":"ContainerDied","Data":"5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f"} Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.469673 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab","Type":"ContainerStarted","Data":"27f3004158c103635dbb2539a38bf820f67bb54ed66c5b9882528c313b13daca"} Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.469938 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.500097 4708 scope.go:117] "RemoveContainer" containerID="2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61" Mar 20 16:23:55 crc kubenswrapper[4708]: E0320 16:23:55.501519 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61\": container with ID starting with 2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61 not found: ID does not exist" containerID="2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.501549 4708 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61"} err="failed to get container status \"2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61\": rpc error: code = NotFound desc = could not find container \"2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61\": container with ID starting with 2d0c9d37ce83a3098ac9f2e67da2b69c11749930f96301f26ef032b58d041a61 not found: ID does not exist" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.522369 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.107922105 podStartE2EDuration="2.522343025s" podCreationTimestamp="2026-03-20 16:23:53 +0000 UTC" firstStartedPulling="2026-03-20 16:23:54.21856249 +0000 UTC m=+1388.892899205" lastFinishedPulling="2026-03-20 16:23:54.63298341 +0000 UTC m=+1389.307320125" observedRunningTime="2026-03-20 16:23:55.500763678 +0000 UTC m=+1390.175100393" watchObservedRunningTime="2026-03-20 16:23:55.522343025 +0000 UTC m=+1390.196679740" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.528180 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.549434 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.564577 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:23:55 crc kubenswrapper[4708]: E0320 16:23:55.565360 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8464020-3eec-491d-b1ba-12afbf0b017c" containerName="nova-scheduler-scheduler" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.565376 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8464020-3eec-491d-b1ba-12afbf0b017c" containerName="nova-scheduler-scheduler" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 
16:23:55.565579 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8464020-3eec-491d-b1ba-12afbf0b017c" containerName="nova-scheduler-scheduler" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.566188 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.569203 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.571267 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.665318 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-config-data\") pod \"nova-scheduler-0\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.665482 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.665843 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcfbx\" (UniqueName: \"kubernetes.io/projected/bb6ea07b-75ea-4861-866a-361b91e1c277-kube-api-access-gcfbx\") pod \"nova-scheduler-0\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.767225 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcfbx\" 
(UniqueName: \"kubernetes.io/projected/bb6ea07b-75ea-4861-866a-361b91e1c277-kube-api-access-gcfbx\") pod \"nova-scheduler-0\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.767323 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-config-data\") pod \"nova-scheduler-0\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.767348 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.770849 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-config-data\") pod \"nova-scheduler-0\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.771130 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.788244 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcfbx\" (UniqueName: \"kubernetes.io/projected/bb6ea07b-75ea-4861-866a-361b91e1c277-kube-api-access-gcfbx\") pod \"nova-scheduler-0\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " 
pod="openstack/nova-scheduler-0" Mar 20 16:23:55 crc kubenswrapper[4708]: I0320 16:23:55.895105 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.126388 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8464020-3eec-491d-b1ba-12afbf0b017c" path="/var/lib/kubelet/pods/a8464020-3eec-491d-b1ba-12afbf0b017c/volumes" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.178323 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.178398 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.352609 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.409037 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.480873 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bb6ea07b-75ea-4861-866a-361b91e1c277","Type":"ContainerStarted","Data":"6588566776bb504c233ee88e41783c20bc35256ae5e1c71e250ebd8bbf9ea7cc"} Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.483373 4708 generic.go:334] "Generic (PLEG): container finished" podID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerID="82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460" exitCode=0 Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.483434 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.483438 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b1b6cd1a-61a4-403a-964e-f502cb67882f","Type":"ContainerDied","Data":"82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460"} Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.483486 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b1b6cd1a-61a4-403a-964e-f502cb67882f","Type":"ContainerDied","Data":"69a9717d864b727278f50474a79cc92df62822441193fa75a3e13911cec35b4b"} Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.483509 4708 scope.go:117] "RemoveContainer" containerID="82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.528930 4708 scope.go:117] "RemoveContainer" containerID="4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.561577 4708 scope.go:117] "RemoveContainer" containerID="82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460" Mar 20 16:23:56 crc kubenswrapper[4708]: E0320 16:23:56.562067 4708 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460\": container with ID starting with 82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460 not found: ID does not exist" containerID="82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.562113 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460"} err="failed to get container status \"82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460\": rpc error: code = NotFound desc = could not find container \"82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460\": container with ID starting with 82202465aaa43b6aedfda9523774c1d9493fe2c77f41a3d38a17082f9da02460 not found: ID does not exist" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.562144 4708 scope.go:117] "RemoveContainer" containerID="4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444" Mar 20 16:23:56 crc kubenswrapper[4708]: E0320 16:23:56.562808 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444\": container with ID starting with 4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444 not found: ID does not exist" containerID="4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.562855 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444"} err="failed to get container status \"4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444\": rpc error: code = NotFound desc = could 
not find container \"4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444\": container with ID starting with 4c223743b96debcc94fb27994e831c6c50a05811666597bf9e8c1d79d1300444 not found: ID does not exist" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.580333 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-combined-ca-bundle\") pod \"b1b6cd1a-61a4-403a-964e-f502cb67882f\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.580549 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-config-data\") pod \"b1b6cd1a-61a4-403a-964e-f502cb67882f\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.580649 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtsfm\" (UniqueName: \"kubernetes.io/projected/b1b6cd1a-61a4-403a-964e-f502cb67882f-kube-api-access-wtsfm\") pod \"b1b6cd1a-61a4-403a-964e-f502cb67882f\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.580764 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b6cd1a-61a4-403a-964e-f502cb67882f-logs\") pod \"b1b6cd1a-61a4-403a-964e-f502cb67882f\" (UID: \"b1b6cd1a-61a4-403a-964e-f502cb67882f\") " Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.581358 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b6cd1a-61a4-403a-964e-f502cb67882f-logs" (OuterVolumeSpecName: "logs") pod "b1b6cd1a-61a4-403a-964e-f502cb67882f" (UID: "b1b6cd1a-61a4-403a-964e-f502cb67882f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.583707 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b6cd1a-61a4-403a-964e-f502cb67882f-kube-api-access-wtsfm" (OuterVolumeSpecName: "kube-api-access-wtsfm") pod "b1b6cd1a-61a4-403a-964e-f502cb67882f" (UID: "b1b6cd1a-61a4-403a-964e-f502cb67882f"). InnerVolumeSpecName "kube-api-access-wtsfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.599303 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.599355 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.613878 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-config-data" (OuterVolumeSpecName: "config-data") pod "b1b6cd1a-61a4-403a-964e-f502cb67882f" (UID: "b1b6cd1a-61a4-403a-964e-f502cb67882f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.613928 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1b6cd1a-61a4-403a-964e-f502cb67882f" (UID: "b1b6cd1a-61a4-403a-964e-f502cb67882f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.699267 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.699307 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b6cd1a-61a4-403a-964e-f502cb67882f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.699316 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtsfm\" (UniqueName: \"kubernetes.io/projected/b1b6cd1a-61a4-403a-964e-f502cb67882f-kube-api-access-wtsfm\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.699325 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b6cd1a-61a4-403a-964e-f502cb67882f-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.876945 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.887814 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.926792 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 16:23:56 crc kubenswrapper[4708]: E0320 16:23:56.927421 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerName="nova-api-api" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.927445 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerName="nova-api-api" Mar 20 16:23:56 crc kubenswrapper[4708]: E0320 
16:23:56.927462 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerName="nova-api-log" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.927471 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerName="nova-api-log" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.931778 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerName="nova-api-log" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.931824 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b6cd1a-61a4-403a-964e-f502cb67882f" containerName="nova-api-api" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.937225 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:23:56 crc kubenswrapper[4708]: I0320 16:23:56.964291 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.031077 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.117906 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7dh9\" (UniqueName: \"kubernetes.io/projected/67ea6ed7-5874-45db-94f4-b77cff6ddc40-kube-api-access-j7dh9\") pod \"nova-api-0\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.118001 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67ea6ed7-5874-45db-94f4-b77cff6ddc40-logs\") pod \"nova-api-0\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: 
I0320 16:23:57.118031 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.118229 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-config-data\") pod \"nova-api-0\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.143093 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.219620 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntbcn\" (UniqueName: \"kubernetes.io/projected/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-kube-api-access-ntbcn\") pod \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.219726 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-scripts\") pod \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.219764 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-log-httpd\") pod \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.219783 4708 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-combined-ca-bundle\") pod \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.219828 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-config-data\") pod \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.219932 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-sg-core-conf-yaml\") pod \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.219990 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-run-httpd\") pod \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\" (UID: \"de69fe5a-0a95-4387-9c43-4ff1fe53bad4\") " Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.220230 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-config-data\") pod \"nova-api-0\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.220324 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7dh9\" (UniqueName: \"kubernetes.io/projected/67ea6ed7-5874-45db-94f4-b77cff6ddc40-kube-api-access-j7dh9\") pod \"nova-api-0\" (UID: 
\"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.220440 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67ea6ed7-5874-45db-94f4-b77cff6ddc40-logs\") pod \"nova-api-0\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.220478 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.224006 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de69fe5a-0a95-4387-9c43-4ff1fe53bad4" (UID: "de69fe5a-0a95-4387-9c43-4ff1fe53bad4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.225140 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de69fe5a-0a95-4387-9c43-4ff1fe53bad4" (UID: "de69fe5a-0a95-4387-9c43-4ff1fe53bad4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.227244 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67ea6ed7-5874-45db-94f4-b77cff6ddc40-logs\") pod \"nova-api-0\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.227752 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.227821 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-kube-api-access-ntbcn" (OuterVolumeSpecName: "kube-api-access-ntbcn") pod "de69fe5a-0a95-4387-9c43-4ff1fe53bad4" (UID: "de69fe5a-0a95-4387-9c43-4ff1fe53bad4"). InnerVolumeSpecName "kube-api-access-ntbcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.232992 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-scripts" (OuterVolumeSpecName: "scripts") pod "de69fe5a-0a95-4387-9c43-4ff1fe53bad4" (UID: "de69fe5a-0a95-4387-9c43-4ff1fe53bad4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.234340 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-config-data\") pod \"nova-api-0\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.250200 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7dh9\" (UniqueName: \"kubernetes.io/projected/67ea6ed7-5874-45db-94f4-b77cff6ddc40-kube-api-access-j7dh9\") pod \"nova-api-0\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.263980 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de69fe5a-0a95-4387-9c43-4ff1fe53bad4" (UID: "de69fe5a-0a95-4387-9c43-4ff1fe53bad4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.310461 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.322566 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntbcn\" (UniqueName: \"kubernetes.io/projected/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-kube-api-access-ntbcn\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.322599 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.322609 4708 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.322620 4708 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.322628 4708 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.352072 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de69fe5a-0a95-4387-9c43-4ff1fe53bad4" (UID: "de69fe5a-0a95-4387-9c43-4ff1fe53bad4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.363913 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-config-data" (OuterVolumeSpecName: "config-data") pod "de69fe5a-0a95-4387-9c43-4ff1fe53bad4" (UID: "de69fe5a-0a95-4387-9c43-4ff1fe53bad4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.424645 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.424705 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de69fe5a-0a95-4387-9c43-4ff1fe53bad4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.499964 4708 generic.go:334] "Generic (PLEG): container finished" podID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerID="b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15" exitCode=0 Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.500031 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de69fe5a-0a95-4387-9c43-4ff1fe53bad4","Type":"ContainerDied","Data":"b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15"} Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.500057 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de69fe5a-0a95-4387-9c43-4ff1fe53bad4","Type":"ContainerDied","Data":"64be44ce3fef71f67903ba7ea130ea6eb2b7bc35180c8ea67812f2b9033a5f45"} Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.500075 4708 scope.go:117] "RemoveContainer" 
containerID="63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.500210 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.505916 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bb6ea07b-75ea-4861-866a-361b91e1c277","Type":"ContainerStarted","Data":"eb25e4e247261583daa1a69711542c6735a3453bd1ba79928a55fd1b90a9063d"} Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.527371 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.527353574 podStartE2EDuration="2.527353574s" podCreationTimestamp="2026-03-20 16:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:57.525268796 +0000 UTC m=+1392.199605511" watchObservedRunningTime="2026-03-20 16:23:57.527353574 +0000 UTC m=+1392.201690279" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.534232 4708 scope.go:117] "RemoveContainer" containerID="d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.551659 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.574942 4708 scope.go:117] "RemoveContainer" containerID="b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.577288 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.586832 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:57 crc kubenswrapper[4708]: E0320 16:23:57.587302 4708 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="ceilometer-central-agent" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.587321 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="ceilometer-central-agent" Mar 20 16:23:57 crc kubenswrapper[4708]: E0320 16:23:57.587356 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="ceilometer-notification-agent" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.587362 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="ceilometer-notification-agent" Mar 20 16:23:57 crc kubenswrapper[4708]: E0320 16:23:57.587377 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="sg-core" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.587384 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="sg-core" Mar 20 16:23:57 crc kubenswrapper[4708]: E0320 16:23:57.587398 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="proxy-httpd" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.587404 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="proxy-httpd" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.587645 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="proxy-httpd" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.587679 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="ceilometer-notification-agent" Mar 20 16:23:57 crc kubenswrapper[4708]: 
I0320 16:23:57.587698 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="sg-core" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.587708 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" containerName="ceilometer-central-agent" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.591278 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.595268 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.595786 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.599583 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.599976 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.604412 4708 scope.go:117] "RemoveContainer" containerID="5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.631485 4708 scope.go:117] "RemoveContainer" containerID="63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0" Mar 20 16:23:57 crc kubenswrapper[4708]: E0320 16:23:57.632208 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0\": container with ID starting with 63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0 not found: ID does not exist" 
containerID="63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.632258 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0"} err="failed to get container status \"63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0\": rpc error: code = NotFound desc = could not find container \"63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0\": container with ID starting with 63b249a2bc466f95db6595b73c11c6ea1d84141ad66b999da611de6951b800f0 not found: ID does not exist" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.632292 4708 scope.go:117] "RemoveContainer" containerID="d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd" Mar 20 16:23:57 crc kubenswrapper[4708]: E0320 16:23:57.632897 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd\": container with ID starting with d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd not found: ID does not exist" containerID="d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.632931 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd"} err="failed to get container status \"d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd\": rpc error: code = NotFound desc = could not find container \"d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd\": container with ID starting with d5644964c455a176fc1d0fb5420479033ee9f6d5635f57fa1dfdf8af1a6df1bd not found: ID does not exist" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.632953 4708 scope.go:117] 
"RemoveContainer" containerID="b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15" Mar 20 16:23:57 crc kubenswrapper[4708]: E0320 16:23:57.633373 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15\": container with ID starting with b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15 not found: ID does not exist" containerID="b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.633405 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15"} err="failed to get container status \"b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15\": rpc error: code = NotFound desc = could not find container \"b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15\": container with ID starting with b41988f55b40c1e0509c231218c77fdd566be4d7bba843231b4df3af8a53df15 not found: ID does not exist" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.633427 4708 scope.go:117] "RemoveContainer" containerID="5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f" Mar 20 16:23:57 crc kubenswrapper[4708]: E0320 16:23:57.633700 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f\": container with ID starting with 5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f not found: ID does not exist" containerID="5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.633727 4708 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f"} err="failed to get container status \"5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f\": rpc error: code = NotFound desc = could not find container \"5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f\": container with ID starting with 5b50d88d311ea0df51fade51497879ed59e3156467ad793a44adf42400bf698f not found: ID does not exist" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.730496 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcxxq\" (UniqueName: \"kubernetes.io/projected/c5c28799-921e-4f2a-8b33-2025ddfe4b90-kube-api-access-jcxxq\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.730593 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-scripts\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.730646 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.730662 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-config-data\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc 
kubenswrapper[4708]: I0320 16:23:57.730706 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.730731 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-run-httpd\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.730755 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-log-httpd\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.730776 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.778234 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.833443 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-scripts\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc 
kubenswrapper[4708]: I0320 16:23:57.833653 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.838092 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-config-data\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.838174 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.838247 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-run-httpd\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.838303 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-log-httpd\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.838357 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.838476 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcxxq\" (UniqueName: \"kubernetes.io/projected/c5c28799-921e-4f2a-8b33-2025ddfe4b90-kube-api-access-jcxxq\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.839165 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-run-httpd\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.845492 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-log-httpd\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.847780 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-scripts\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.854653 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.863922 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-config-data\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.865072 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.880459 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.896598 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcxxq\" (UniqueName: \"kubernetes.io/projected/c5c28799-921e-4f2a-8b33-2025ddfe4b90-kube-api-access-jcxxq\") pod \"ceilometer-0\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " pod="openstack/ceilometer-0" Mar 20 16:23:57 crc kubenswrapper[4708]: I0320 16:23:57.914044 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:23:58 crc kubenswrapper[4708]: I0320 16:23:58.126950 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b6cd1a-61a4-403a-964e-f502cb67882f" path="/var/lib/kubelet/pods/b1b6cd1a-61a4-403a-964e-f502cb67882f/volumes" Mar 20 16:23:58 crc kubenswrapper[4708]: I0320 16:23:58.128474 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de69fe5a-0a95-4387-9c43-4ff1fe53bad4" path="/var/lib/kubelet/pods/de69fe5a-0a95-4387-9c43-4ff1fe53bad4/volumes" Mar 20 16:23:58 crc kubenswrapper[4708]: I0320 16:23:58.454318 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:23:58 crc kubenswrapper[4708]: I0320 16:23:58.521827 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67ea6ed7-5874-45db-94f4-b77cff6ddc40","Type":"ContainerStarted","Data":"a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba"} Mar 20 16:23:58 crc kubenswrapper[4708]: I0320 16:23:58.521884 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67ea6ed7-5874-45db-94f4-b77cff6ddc40","Type":"ContainerStarted","Data":"1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc"} Mar 20 16:23:58 crc kubenswrapper[4708]: I0320 16:23:58.521903 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67ea6ed7-5874-45db-94f4-b77cff6ddc40","Type":"ContainerStarted","Data":"4a0b653406c25eb160e02d01e34772a83f16fa67e7aab03bdd17640c53defcbf"} Mar 20 16:23:58 crc kubenswrapper[4708]: I0320 16:23:58.524043 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5c28799-921e-4f2a-8b33-2025ddfe4b90","Type":"ContainerStarted","Data":"738542e3e5a4a3f528995220e42ac9cff552895fd966f6d31d7c0cace5ccdc80"} Mar 20 16:23:58 crc kubenswrapper[4708]: I0320 16:23:58.544652 4708 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5445929229999997 podStartE2EDuration="2.544592923s" podCreationTimestamp="2026-03-20 16:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:23:58.540807161 +0000 UTC m=+1393.215143876" watchObservedRunningTime="2026-03-20 16:23:58.544592923 +0000 UTC m=+1393.218929658" Mar 20 16:23:59 crc kubenswrapper[4708]: I0320 16:23:59.551189 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5c28799-921e-4f2a-8b33-2025ddfe4b90","Type":"ContainerStarted","Data":"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68"} Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.144284 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567064-kth9n"] Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.146134 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-kth9n" Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.148699 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.153064 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.153211 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5" Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.158205 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-kth9n"] Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.297281 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67gdw\" (UniqueName: \"kubernetes.io/projected/55f18481-391a-4e73-880c-f1d058ac7564-kube-api-access-67gdw\") pod \"auto-csr-approver-29567064-kth9n\" (UID: \"55f18481-391a-4e73-880c-f1d058ac7564\") " pod="openshift-infra/auto-csr-approver-29567064-kth9n" Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.399922 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67gdw\" (UniqueName: \"kubernetes.io/projected/55f18481-391a-4e73-880c-f1d058ac7564-kube-api-access-67gdw\") pod \"auto-csr-approver-29567064-kth9n\" (UID: \"55f18481-391a-4e73-880c-f1d058ac7564\") " pod="openshift-infra/auto-csr-approver-29567064-kth9n" Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.423549 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67gdw\" (UniqueName: \"kubernetes.io/projected/55f18481-391a-4e73-880c-f1d058ac7564-kube-api-access-67gdw\") pod \"auto-csr-approver-29567064-kth9n\" (UID: \"55f18481-391a-4e73-880c-f1d058ac7564\") " 
pod="openshift-infra/auto-csr-approver-29567064-kth9n" Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.475260 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-kth9n" Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.572325 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5c28799-921e-4f2a-8b33-2025ddfe4b90","Type":"ContainerStarted","Data":"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682"} Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.778957 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-kth9n"] Mar 20 16:24:00 crc kubenswrapper[4708]: I0320 16:24:00.895591 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 16:24:01 crc kubenswrapper[4708]: I0320 16:24:01.601370 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5c28799-921e-4f2a-8b33-2025ddfe4b90","Type":"ContainerStarted","Data":"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23"} Mar 20 16:24:01 crc kubenswrapper[4708]: I0320 16:24:01.603752 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-kth9n" event={"ID":"55f18481-391a-4e73-880c-f1d058ac7564","Type":"ContainerStarted","Data":"1f01617d5ffafd021515b061e9e4ec5fcd73dcc668b617995f042deec82000f6"} Mar 20 16:24:01 crc kubenswrapper[4708]: I0320 16:24:01.862161 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 16:24:02 crc kubenswrapper[4708]: I0320 16:24:02.617117 4708 generic.go:334] "Generic (PLEG): container finished" podID="55f18481-391a-4e73-880c-f1d058ac7564" containerID="10e907f3de9388e78a11a657c0d845a24907fd75ef92953dd76b9a81580245e8" exitCode=0 Mar 20 16:24:02 crc kubenswrapper[4708]: I0320 
16:24:02.617230 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-kth9n" event={"ID":"55f18481-391a-4e73-880c-f1d058ac7564","Type":"ContainerDied","Data":"10e907f3de9388e78a11a657c0d845a24907fd75ef92953dd76b9a81580245e8"} Mar 20 16:24:03 crc kubenswrapper[4708]: I0320 16:24:03.630052 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5c28799-921e-4f2a-8b33-2025ddfe4b90","Type":"ContainerStarted","Data":"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2"} Mar 20 16:24:03 crc kubenswrapper[4708]: I0320 16:24:03.663463 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.406750678 podStartE2EDuration="6.663437458s" podCreationTimestamp="2026-03-20 16:23:57 +0000 UTC" firstStartedPulling="2026-03-20 16:23:58.452622334 +0000 UTC m=+1393.126959059" lastFinishedPulling="2026-03-20 16:24:02.709309124 +0000 UTC m=+1397.383645839" observedRunningTime="2026-03-20 16:24:03.657390884 +0000 UTC m=+1398.331727619" watchObservedRunningTime="2026-03-20 16:24:03.663437458 +0000 UTC m=+1398.337774173" Mar 20 16:24:03 crc kubenswrapper[4708]: I0320 16:24:03.909096 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 16:24:04 crc kubenswrapper[4708]: I0320 16:24:04.099969 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-kth9n" Mar 20 16:24:04 crc kubenswrapper[4708]: I0320 16:24:04.176119 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67gdw\" (UniqueName: \"kubernetes.io/projected/55f18481-391a-4e73-880c-f1d058ac7564-kube-api-access-67gdw\") pod \"55f18481-391a-4e73-880c-f1d058ac7564\" (UID: \"55f18481-391a-4e73-880c-f1d058ac7564\") " Mar 20 16:24:04 crc kubenswrapper[4708]: I0320 16:24:04.184194 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55f18481-391a-4e73-880c-f1d058ac7564-kube-api-access-67gdw" (OuterVolumeSpecName: "kube-api-access-67gdw") pod "55f18481-391a-4e73-880c-f1d058ac7564" (UID: "55f18481-391a-4e73-880c-f1d058ac7564"). InnerVolumeSpecName "kube-api-access-67gdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:04 crc kubenswrapper[4708]: I0320 16:24:04.279411 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67gdw\" (UniqueName: \"kubernetes.io/projected/55f18481-391a-4e73-880c-f1d058ac7564-kube-api-access-67gdw\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:04 crc kubenswrapper[4708]: I0320 16:24:04.641232 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-kth9n" event={"ID":"55f18481-391a-4e73-880c-f1d058ac7564","Type":"ContainerDied","Data":"1f01617d5ffafd021515b061e9e4ec5fcd73dcc668b617995f042deec82000f6"} Mar 20 16:24:04 crc kubenswrapper[4708]: I0320 16:24:04.641283 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-kth9n" Mar 20 16:24:04 crc kubenswrapper[4708]: I0320 16:24:04.641296 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f01617d5ffafd021515b061e9e4ec5fcd73dcc668b617995f042deec82000f6" Mar 20 16:24:04 crc kubenswrapper[4708]: I0320 16:24:04.641909 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:24:05 crc kubenswrapper[4708]: I0320 16:24:05.186804 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-s2vqj"] Mar 20 16:24:05 crc kubenswrapper[4708]: I0320 16:24:05.199659 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-s2vqj"] Mar 20 16:24:05 crc kubenswrapper[4708]: I0320 16:24:05.895721 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 16:24:05 crc kubenswrapper[4708]: I0320 16:24:05.933089 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 16:24:06 crc kubenswrapper[4708]: I0320 16:24:06.128926 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffb520a6-2a16-4d11-a67c-86ac48535b9a" path="/var/lib/kubelet/pods/ffb520a6-2a16-4d11-a67c-86ac48535b9a/volumes" Mar 20 16:24:06 crc kubenswrapper[4708]: I0320 16:24:06.703861 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 16:24:07 crc kubenswrapper[4708]: I0320 16:24:07.311298 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:24:07 crc kubenswrapper[4708]: I0320 16:24:07.311412 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:24:08 crc kubenswrapper[4708]: I0320 16:24:08.394890 4708 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:08 crc kubenswrapper[4708]: I0320 16:24:08.394901 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:13 crc kubenswrapper[4708]: W0320 16:24:13.367047 4708 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55f18481_391a_4e73_880c_f1d058ac7564.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55f18481_391a_4e73_880c_f1d058ac7564.slice: no such file or directory Mar 20 16:24:13 crc kubenswrapper[4708]: E0320 16:24:13.595918 4708 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddd133b9_300b_403d_a469_58790dee930c.slice/crio-13cbad05c000901dcea7491f96b1f0073385a627c23a0dc556a431c8a31067d2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddd133b9_300b_403d_a469_58790dee930c.slice/crio-conmon-13cbad05c000901dcea7491f96b1f0073385a627c23a0dc556a431c8a31067d2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41c1c7e2_c3b1_4c5f_8134_c4998addfdf4.slice/crio-conmon-3b456a81cb7201f04a2346e2de43280f2de53da90db40ec10ff5b5e2e9c7b99c.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf676e7f_5129_436c_9451_8a9b1c8c19c0.slice/crio-84915d182a57c84c0d30885753a625a1d23becd15174c9ff41a9700696654d79\": RecentStats: unable to find data in memory cache]" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.737600 4708 generic.go:334] "Generic (PLEG): container finished" podID="ddd133b9-300b-403d-a469-58790dee930c" containerID="13cbad05c000901dcea7491f96b1f0073385a627c23a0dc556a431c8a31067d2" exitCode=137 Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.737911 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ddd133b9-300b-403d-a469-58790dee930c","Type":"ContainerDied","Data":"13cbad05c000901dcea7491f96b1f0073385a627c23a0dc556a431c8a31067d2"} Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.737948 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ddd133b9-300b-403d-a469-58790dee930c","Type":"ContainerDied","Data":"9bce4e37d21a9eec48cd80497d451eda4b688007ed81c6b810069d529a6cf571"} Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.737961 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bce4e37d21a9eec48cd80497d451eda4b688007ed81c6b810069d529a6cf571" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.748433 4708 generic.go:334] "Generic (PLEG): container finished" podID="41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" containerID="3b456a81cb7201f04a2346e2de43280f2de53da90db40ec10ff5b5e2e9c7b99c" exitCode=137 Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.748499 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4","Type":"ContainerDied","Data":"3b456a81cb7201f04a2346e2de43280f2de53da90db40ec10ff5b5e2e9c7b99c"} Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.748535 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4","Type":"ContainerDied","Data":"ec198aebba5f734e4a6d9ac66df948e2082b2f25c0802146c065a3c4a61d69c9"} Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.748551 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec198aebba5f734e4a6d9ac66df948e2082b2f25c0802146c065a3c4a61d69c9" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.805784 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.815575 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.883619 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-combined-ca-bundle\") pod \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.883841 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-config-data\") pod \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.883885 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-logs\") pod \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.883996 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-config-data\") pod \"ddd133b9-300b-403d-a469-58790dee930c\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.884049 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjbx5\" (UniqueName: \"kubernetes.io/projected/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-kube-api-access-vjbx5\") pod \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\" (UID: \"41c1c7e2-c3b1-4c5f-8134-c4998addfdf4\") " Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.884154 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-combined-ca-bundle\") pod \"ddd133b9-300b-403d-a469-58790dee930c\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.884281 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm6mk\" (UniqueName: \"kubernetes.io/projected/ddd133b9-300b-403d-a469-58790dee930c-kube-api-access-tm6mk\") pod \"ddd133b9-300b-403d-a469-58790dee930c\" (UID: \"ddd133b9-300b-403d-a469-58790dee930c\") " Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.884296 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-logs" (OuterVolumeSpecName: "logs") pod "41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" (UID: "41c1c7e2-c3b1-4c5f-8134-c4998addfdf4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.884919 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.890987 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-kube-api-access-vjbx5" (OuterVolumeSpecName: "kube-api-access-vjbx5") pod "41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" (UID: "41c1c7e2-c3b1-4c5f-8134-c4998addfdf4"). InnerVolumeSpecName "kube-api-access-vjbx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.897534 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd133b9-300b-403d-a469-58790dee930c-kube-api-access-tm6mk" (OuterVolumeSpecName: "kube-api-access-tm6mk") pod "ddd133b9-300b-403d-a469-58790dee930c" (UID: "ddd133b9-300b-403d-a469-58790dee930c"). InnerVolumeSpecName "kube-api-access-tm6mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.914114 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-config-data" (OuterVolumeSpecName: "config-data") pod "41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" (UID: "41c1c7e2-c3b1-4c5f-8134-c4998addfdf4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.914750 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" (UID: "41c1c7e2-c3b1-4c5f-8134-c4998addfdf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.918810 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-config-data" (OuterVolumeSpecName: "config-data") pod "ddd133b9-300b-403d-a469-58790dee930c" (UID: "ddd133b9-300b-403d-a469-58790dee930c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.920542 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd133b9-300b-403d-a469-58790dee930c" (UID: "ddd133b9-300b-403d-a469-58790dee930c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.986849 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.986886 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjbx5\" (UniqueName: \"kubernetes.io/projected/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-kube-api-access-vjbx5\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.986899 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd133b9-300b-403d-a469-58790dee930c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.986908 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm6mk\" (UniqueName: \"kubernetes.io/projected/ddd133b9-300b-403d-a469-58790dee930c-kube-api-access-tm6mk\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.986916 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:13 crc kubenswrapper[4708]: I0320 16:24:13.986927 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.764927 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.765270 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.797948 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.824304 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.841737 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.852753 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.877309 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:24:14 crc kubenswrapper[4708]: E0320 16:24:14.878081 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd133b9-300b-403d-a469-58790dee930c" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.878213 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd133b9-300b-403d-a469-58790dee930c" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 16:24:14 crc kubenswrapper[4708]: E0320 16:24:14.878303 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55f18481-391a-4e73-880c-f1d058ac7564" containerName="oc" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.878376 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f18481-391a-4e73-880c-f1d058ac7564" containerName="oc" Mar 20 16:24:14 crc kubenswrapper[4708]: E0320 16:24:14.878449 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" containerName="nova-metadata-log" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.878521 4708 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" containerName="nova-metadata-log" Mar 20 16:24:14 crc kubenswrapper[4708]: E0320 16:24:14.878591 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" containerName="nova-metadata-metadata" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.878651 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" containerName="nova-metadata-metadata" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.878991 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" containerName="nova-metadata-metadata" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.879088 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd133b9-300b-403d-a469-58790dee930c" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.879358 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="55f18481-391a-4e73-880c-f1d058ac7564" containerName="oc" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.879427 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" containerName="nova-metadata-log" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.880400 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.883861 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.883907 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.884132 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.893880 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.895815 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.900472 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.900643 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.906534 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:24:14 crc kubenswrapper[4708]: I0320 16:24:14.920370 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.010192 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc 
kubenswrapper[4708]: I0320 16:24:15.010452 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-config-data\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.010554 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtr5k\" (UniqueName: \"kubernetes.io/projected/2659505d-fddd-42b2-a820-21fd1bac6479-kube-api-access-jtr5k\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.010660 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.010826 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.010865 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5fb\" (UniqueName: \"kubernetes.io/projected/0e151c63-d78f-4799-9489-09c4d91cb4ab-kube-api-access-hr5fb\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 
crc kubenswrapper[4708]: I0320 16:24:15.010940 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.010994 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.011124 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.011237 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2659505d-fddd-42b2-a820-21fd1bac6479-logs\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.113218 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.113276 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5fb\" (UniqueName: \"kubernetes.io/projected/0e151c63-d78f-4799-9489-09c4d91cb4ab-kube-api-access-hr5fb\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.113298 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.113325 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.113368 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.113417 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2659505d-fddd-42b2-a820-21fd1bac6479-logs\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.113468 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.113512 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-config-data\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.113535 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtr5k\" (UniqueName: \"kubernetes.io/projected/2659505d-fddd-42b2-a820-21fd1bac6479-kube-api-access-jtr5k\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.113586 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.114123 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2659505d-fddd-42b2-a820-21fd1bac6479-logs\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.119904 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-config-data\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " 
pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.119845 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.120901 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.121378 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.121422 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.123741 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e151c63-d78f-4799-9489-09c4d91cb4ab-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.123791 4708 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.132837 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5fb\" (UniqueName: \"kubernetes.io/projected/0e151c63-d78f-4799-9489-09c4d91cb4ab-kube-api-access-hr5fb\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e151c63-d78f-4799-9489-09c4d91cb4ab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.133968 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtr5k\" (UniqueName: \"kubernetes.io/projected/2659505d-fddd-42b2-a820-21fd1bac6479-kube-api-access-jtr5k\") pod \"nova-metadata-0\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.210589 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.226454 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.311985 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.312030 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.710725 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:15 crc kubenswrapper[4708]: W0320 16:24:15.711791 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2659505d_fddd_42b2_a820_21fd1bac6479.slice/crio-79abddf9ba44fe01b559f702ef729ac7e79b32918d88e269286f3abf174ba9a4 WatchSource:0}: Error finding container 79abddf9ba44fe01b559f702ef729ac7e79b32918d88e269286f3abf174ba9a4: Status 404 returned error can't find the container with id 79abddf9ba44fe01b559f702ef729ac7e79b32918d88e269286f3abf174ba9a4 Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.784600 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2659505d-fddd-42b2-a820-21fd1bac6479","Type":"ContainerStarted","Data":"79abddf9ba44fe01b559f702ef729ac7e79b32918d88e269286f3abf174ba9a4"} Mar 20 16:24:15 crc kubenswrapper[4708]: W0320 16:24:15.793841 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e151c63_d78f_4799_9489_09c4d91cb4ab.slice/crio-44a257d000fede69fb4fc0df950045aa8178618c02c3dfac412a54c9c9aa8d93 WatchSource:0}: Error finding container 44a257d000fede69fb4fc0df950045aa8178618c02c3dfac412a54c9c9aa8d93: Status 404 returned error can't find the container with id 44a257d000fede69fb4fc0df950045aa8178618c02c3dfac412a54c9c9aa8d93 Mar 20 16:24:15 crc kubenswrapper[4708]: I0320 16:24:15.795523 4708 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 16:24:16 crc kubenswrapper[4708]: I0320 16:24:16.123904 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c1c7e2-c3b1-4c5f-8134-c4998addfdf4" path="/var/lib/kubelet/pods/41c1c7e2-c3b1-4c5f-8134-c4998addfdf4/volumes" Mar 20 16:24:16 crc kubenswrapper[4708]: I0320 16:24:16.125292 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd133b9-300b-403d-a469-58790dee930c" path="/var/lib/kubelet/pods/ddd133b9-300b-403d-a469-58790dee930c/volumes" Mar 20 16:24:16 crc kubenswrapper[4708]: I0320 16:24:16.735394 4708 scope.go:117] "RemoveContainer" containerID="d92bd51b48fc4d2e903cf8ff8703ab1c2c6b5997c2e523091504d71742c0e445" Mar 20 16:24:16 crc kubenswrapper[4708]: I0320 16:24:16.795303 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2659505d-fddd-42b2-a820-21fd1bac6479","Type":"ContainerStarted","Data":"392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def"} Mar 20 16:24:16 crc kubenswrapper[4708]: I0320 16:24:16.795348 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2659505d-fddd-42b2-a820-21fd1bac6479","Type":"ContainerStarted","Data":"ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2"} Mar 20 16:24:16 crc kubenswrapper[4708]: I0320 16:24:16.798078 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0e151c63-d78f-4799-9489-09c4d91cb4ab","Type":"ContainerStarted","Data":"e01496f18a5015f8033d3139554435992d236902e59f19929e3bdd57bd143533"} Mar 20 16:24:16 crc kubenswrapper[4708]: I0320 16:24:16.798154 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0e151c63-d78f-4799-9489-09c4d91cb4ab","Type":"ContainerStarted","Data":"44a257d000fede69fb4fc0df950045aa8178618c02c3dfac412a54c9c9aa8d93"} Mar 20 16:24:16 crc 
kubenswrapper[4708]: I0320 16:24:16.870026 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.870003387 podStartE2EDuration="2.870003387s" podCreationTimestamp="2026-03-20 16:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:16.81564353 +0000 UTC m=+1411.489980245" watchObservedRunningTime="2026-03-20 16:24:16.870003387 +0000 UTC m=+1411.544340102" Mar 20 16:24:16 crc kubenswrapper[4708]: I0320 16:24:16.878504 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.878478727 podStartE2EDuration="2.878478727s" podCreationTimestamp="2026-03-20 16:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:16.841569534 +0000 UTC m=+1411.515906249" watchObservedRunningTime="2026-03-20 16:24:16.878478727 +0000 UTC m=+1411.552815442" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.316108 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.316183 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.318711 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.319918 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.657434 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-mkqbz"] Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.659387 4708 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.684151 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-mkqbz"] Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.768238 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.768825 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.768916 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8lqj\" (UniqueName: \"kubernetes.io/projected/263a49a4-a2a6-4d75-82b5-cb508abfe752-kube-api-access-q8lqj\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.769407 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.769563 4708 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.769726 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-config\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.871875 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.871958 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.872011 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-config\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.872040 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.872058 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.872155 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8lqj\" (UniqueName: \"kubernetes.io/projected/263a49a4-a2a6-4d75-82b5-cb508abfe752-kube-api-access-q8lqj\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.872945 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.872945 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-config\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.873264 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.873540 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.873580 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/263a49a4-a2a6-4d75-82b5-cb508abfe752-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:17 crc kubenswrapper[4708]: I0320 16:24:17.891625 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8lqj\" (UniqueName: \"kubernetes.io/projected/263a49a4-a2a6-4d75-82b5-cb508abfe752-kube-api-access-q8lqj\") pod \"dnsmasq-dns-89c5cd4d5-mkqbz\" (UID: \"263a49a4-a2a6-4d75-82b5-cb508abfe752\") " pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:18 crc kubenswrapper[4708]: I0320 16:24:18.300997 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:18 crc kubenswrapper[4708]: I0320 16:24:18.782296 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-mkqbz"] Mar 20 16:24:18 crc kubenswrapper[4708]: I0320 16:24:18.827493 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" event={"ID":"263a49a4-a2a6-4d75-82b5-cb508abfe752","Type":"ContainerStarted","Data":"f1fb1c9931a91c5551ceaccd875ed3962898b4879b2bea006b473406c4cb6896"} Mar 20 16:24:19 crc kubenswrapper[4708]: I0320 16:24:19.763934 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:19 crc kubenswrapper[4708]: I0320 16:24:19.764753 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="ceilometer-central-agent" containerID="cri-o://ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68" gracePeriod=30 Mar 20 16:24:19 crc kubenswrapper[4708]: I0320 16:24:19.764824 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="ceilometer-notification-agent" containerID="cri-o://c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682" gracePeriod=30 Mar 20 16:24:19 crc kubenswrapper[4708]: I0320 16:24:19.764864 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="sg-core" containerID="cri-o://9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23" gracePeriod=30 Mar 20 16:24:19 crc kubenswrapper[4708]: I0320 16:24:19.764906 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="proxy-httpd" 
containerID="cri-o://ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2" gracePeriod=30 Mar 20 16:24:19 crc kubenswrapper[4708]: I0320 16:24:19.777035 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.202:3000/\": EOF" Mar 20 16:24:19 crc kubenswrapper[4708]: I0320 16:24:19.838448 4708 generic.go:334] "Generic (PLEG): container finished" podID="263a49a4-a2a6-4d75-82b5-cb508abfe752" containerID="e184bce7afd41cc9afe4e059c1816f5beb4aa403304527d49caa7effb559d508" exitCode=0 Mar 20 16:24:19 crc kubenswrapper[4708]: I0320 16:24:19.838496 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" event={"ID":"263a49a4-a2a6-4d75-82b5-cb508abfe752","Type":"ContainerDied","Data":"e184bce7afd41cc9afe4e059c1816f5beb4aa403304527d49caa7effb559d508"} Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.062436 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.063030 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" containerName="nova-api-log" containerID="cri-o://1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc" gracePeriod=30 Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.063076 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" containerName="nova-api-api" containerID="cri-o://a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba" gracePeriod=30 Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.211592 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:20 crc 
kubenswrapper[4708]: I0320 16:24:20.662592 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.849716 4708 generic.go:334] "Generic (PLEG): container finished" podID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" containerID="1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc" exitCode=143 Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.849778 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67ea6ed7-5874-45db-94f4-b77cff6ddc40","Type":"ContainerDied","Data":"1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc"} Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.851984 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-ceilometer-tls-certs\") pod \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.852143 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-log-httpd\") pod \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.852432 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcxxq\" (UniqueName: \"kubernetes.io/projected/c5c28799-921e-4f2a-8b33-2025ddfe4b90-kube-api-access-jcxxq\") pod \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.852483 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-log-httpd" 
(OuterVolumeSpecName: "log-httpd") pod "c5c28799-921e-4f2a-8b33-2025ddfe4b90" (UID: "c5c28799-921e-4f2a-8b33-2025ddfe4b90"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.852571 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-combined-ca-bundle\") pod \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.852638 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-sg-core-conf-yaml\") pod \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.852920 4708 generic.go:334] "Generic (PLEG): container finished" podID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerID="ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2" exitCode=0 Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.852974 4708 generic.go:334] "Generic (PLEG): container finished" podID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerID="9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23" exitCode=2 Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.852985 4708 generic.go:334] "Generic (PLEG): container finished" podID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerID="c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682" exitCode=0 Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.852994 4708 generic.go:334] "Generic (PLEG): container finished" podID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerID="ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68" exitCode=0 Mar 20 16:24:20 crc kubenswrapper[4708]: 
I0320 16:24:20.852977 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5c28799-921e-4f2a-8b33-2025ddfe4b90","Type":"ContainerDied","Data":"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2"} Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.853036 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5c28799-921e-4f2a-8b33-2025ddfe4b90","Type":"ContainerDied","Data":"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23"} Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.853030 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.853057 4708 scope.go:117] "RemoveContainer" containerID="ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.853046 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5c28799-921e-4f2a-8b33-2025ddfe4b90","Type":"ContainerDied","Data":"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682"} Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.853160 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5c28799-921e-4f2a-8b33-2025ddfe4b90","Type":"ContainerDied","Data":"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68"} Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.853187 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5c28799-921e-4f2a-8b33-2025ddfe4b90","Type":"ContainerDied","Data":"738542e3e5a4a3f528995220e42ac9cff552895fd966f6d31d7c0cace5ccdc80"} Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.854243 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-scripts\") pod \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.854390 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-run-httpd\") pod \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.856752 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c5c28799-921e-4f2a-8b33-2025ddfe4b90" (UID: "c5c28799-921e-4f2a-8b33-2025ddfe4b90"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.856825 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-config-data\") pod \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\" (UID: \"c5c28799-921e-4f2a-8b33-2025ddfe4b90\") " Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.857808 4708 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.857839 4708 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5c28799-921e-4f2a-8b33-2025ddfe4b90-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.860164 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-scripts" (OuterVolumeSpecName: "scripts") pod "c5c28799-921e-4f2a-8b33-2025ddfe4b90" (UID: "c5c28799-921e-4f2a-8b33-2025ddfe4b90"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.860967 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c28799-921e-4f2a-8b33-2025ddfe4b90-kube-api-access-jcxxq" (OuterVolumeSpecName: "kube-api-access-jcxxq") pod "c5c28799-921e-4f2a-8b33-2025ddfe4b90" (UID: "c5c28799-921e-4f2a-8b33-2025ddfe4b90"). InnerVolumeSpecName "kube-api-access-jcxxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.863606 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" event={"ID":"263a49a4-a2a6-4d75-82b5-cb508abfe752","Type":"ContainerStarted","Data":"ddefe7d9dbd0578165674f269814fe441ccf541511dd97551f4bbfdcd4bda5cd"} Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.865074 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.877889 4708 scope.go:117] "RemoveContainer" containerID="9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.886620 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" podStartSLOduration=3.886598482 podStartE2EDuration="3.886598482s" podCreationTimestamp="2026-03-20 16:24:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:20.884001512 +0000 UTC m=+1415.558338247" watchObservedRunningTime="2026-03-20 16:24:20.886598482 +0000 UTC m=+1415.560935197" Mar 20 16:24:20 crc 
kubenswrapper[4708]: I0320 16:24:20.895035 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c5c28799-921e-4f2a-8b33-2025ddfe4b90" (UID: "c5c28799-921e-4f2a-8b33-2025ddfe4b90"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.907383 4708 scope.go:117] "RemoveContainer" containerID="c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.921184 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c5c28799-921e-4f2a-8b33-2025ddfe4b90" (UID: "c5c28799-921e-4f2a-8b33-2025ddfe4b90"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.943581 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5c28799-921e-4f2a-8b33-2025ddfe4b90" (UID: "c5c28799-921e-4f2a-8b33-2025ddfe4b90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.959253 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.959291 4708 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.959304 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.959315 4708 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:20 crc kubenswrapper[4708]: I0320 16:24:20.959324 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcxxq\" (UniqueName: \"kubernetes.io/projected/c5c28799-921e-4f2a-8b33-2025ddfe4b90-kube-api-access-jcxxq\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.010828 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-config-data" (OuterVolumeSpecName: "config-data") pod "c5c28799-921e-4f2a-8b33-2025ddfe4b90" (UID: "c5c28799-921e-4f2a-8b33-2025ddfe4b90"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.061230 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5c28799-921e-4f2a-8b33-2025ddfe4b90-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.109474 4708 scope.go:117] "RemoveContainer" containerID="ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.146044 4708 scope.go:117] "RemoveContainer" containerID="ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2" Mar 20 16:24:21 crc kubenswrapper[4708]: E0320 16:24:21.146664 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2\": container with ID starting with ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2 not found: ID does not exist" containerID="ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.146743 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2"} err="failed to get container status \"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2\": rpc error: code = NotFound desc = could not find container \"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2\": container with ID starting with ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.146785 4708 scope.go:117] "RemoveContainer" containerID="9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23" Mar 20 16:24:21 crc kubenswrapper[4708]: E0320 16:24:21.147282 4708 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23\": container with ID starting with 9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23 not found: ID does not exist" containerID="9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.147314 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23"} err="failed to get container status \"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23\": rpc error: code = NotFound desc = could not find container \"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23\": container with ID starting with 9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.147338 4708 scope.go:117] "RemoveContainer" containerID="c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682" Mar 20 16:24:21 crc kubenswrapper[4708]: E0320 16:24:21.150981 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682\": container with ID starting with c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682 not found: ID does not exist" containerID="c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.151008 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682"} err="failed to get container status \"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682\": rpc error: code = NotFound desc = could 
not find container \"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682\": container with ID starting with c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.151026 4708 scope.go:117] "RemoveContainer" containerID="ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68" Mar 20 16:24:21 crc kubenswrapper[4708]: E0320 16:24:21.151428 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68\": container with ID starting with ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68 not found: ID does not exist" containerID="ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.151449 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68"} err="failed to get container status \"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68\": rpc error: code = NotFound desc = could not find container \"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68\": container with ID starting with ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.151461 4708 scope.go:117] "RemoveContainer" containerID="ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.151688 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2"} err="failed to get container status \"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2\": rpc error: code = NotFound 
desc = could not find container \"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2\": container with ID starting with ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.151707 4708 scope.go:117] "RemoveContainer" containerID="9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.151907 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23"} err="failed to get container status \"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23\": rpc error: code = NotFound desc = could not find container \"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23\": container with ID starting with 9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.151927 4708 scope.go:117] "RemoveContainer" containerID="c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.152076 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682"} err="failed to get container status \"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682\": rpc error: code = NotFound desc = could not find container \"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682\": container with ID starting with c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.152094 4708 scope.go:117] "RemoveContainer" containerID="ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 
16:24:21.152332 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68"} err="failed to get container status \"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68\": rpc error: code = NotFound desc = could not find container \"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68\": container with ID starting with ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.152351 4708 scope.go:117] "RemoveContainer" containerID="ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.152759 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2"} err="failed to get container status \"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2\": rpc error: code = NotFound desc = could not find container \"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2\": container with ID starting with ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.152780 4708 scope.go:117] "RemoveContainer" containerID="9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.152972 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23"} err="failed to get container status \"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23\": rpc error: code = NotFound desc = could not find container \"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23\": container with ID starting with 
9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.152999 4708 scope.go:117] "RemoveContainer" containerID="c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.153229 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682"} err="failed to get container status \"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682\": rpc error: code = NotFound desc = could not find container \"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682\": container with ID starting with c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.153250 4708 scope.go:117] "RemoveContainer" containerID="ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.161783 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68"} err="failed to get container status \"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68\": rpc error: code = NotFound desc = could not find container \"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68\": container with ID starting with ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.161825 4708 scope.go:117] "RemoveContainer" containerID="ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.172423 4708 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2"} err="failed to get container status \"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2\": rpc error: code = NotFound desc = could not find container \"ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2\": container with ID starting with ff326045cb6961148ce44f08837ce09c43aa3c6b96bf62dc0a38e35b6fe7e2f2 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.172461 4708 scope.go:117] "RemoveContainer" containerID="9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.172943 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23"} err="failed to get container status \"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23\": rpc error: code = NotFound desc = could not find container \"9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23\": container with ID starting with 9736280a6d4a1c2e582c19a4ac993d1971dc427fb1fc2353763bee0ccf762b23 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.172969 4708 scope.go:117] "RemoveContainer" containerID="c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.173203 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682"} err="failed to get container status \"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682\": rpc error: code = NotFound desc = could not find container \"c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682\": container with ID starting with c7fbdd5bdd2c3257574d1c854a4c1653458c0cf7e60c2bac82d2efffe7557682 not found: ID does not 
exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.173225 4708 scope.go:117] "RemoveContainer" containerID="ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.174157 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68"} err="failed to get container status \"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68\": rpc error: code = NotFound desc = could not find container \"ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68\": container with ID starting with ebe9da639e76e980a7e00c9c3a45ee1ed176872834455080ce410a39b2a3bc68 not found: ID does not exist" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.187273 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.211002 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.236786 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:21 crc kubenswrapper[4708]: E0320 16:24:21.237288 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="sg-core" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.237312 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="sg-core" Mar 20 16:24:21 crc kubenswrapper[4708]: E0320 16:24:21.237328 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="proxy-httpd" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.237336 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="proxy-httpd" Mar 20 
16:24:21 crc kubenswrapper[4708]: E0320 16:24:21.237349 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="ceilometer-central-agent" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.237358 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="ceilometer-central-agent" Mar 20 16:24:21 crc kubenswrapper[4708]: E0320 16:24:21.237384 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="ceilometer-notification-agent" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.237391 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="ceilometer-notification-agent" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.237811 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="ceilometer-notification-agent" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.237836 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="proxy-httpd" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.237850 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="ceilometer-central-agent" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.237879 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" containerName="sg-core" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.242135 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.245806 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.246038 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.250101 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.258567 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.369390 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.369479 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-scripts\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.369533 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hd6d\" (UniqueName: \"kubernetes.io/projected/ee996747-dcce-47bb-b6b1-493c9e40f6c5-kube-api-access-9hd6d\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.369553 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-config-data\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.369579 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-log-httpd\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.369913 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.370005 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-run-httpd\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.370183 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.471948 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.472030 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.472069 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-scripts\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.472103 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hd6d\" (UniqueName: \"kubernetes.io/projected/ee996747-dcce-47bb-b6b1-493c9e40f6c5-kube-api-access-9hd6d\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.472120 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-config-data\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.472138 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-log-httpd\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.472186 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.472214 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-run-httpd\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.472746 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-run-httpd\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.474105 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-log-httpd\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.477518 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-scripts\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.478132 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.478346 
4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.479346 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.493828 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-config-data\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.498485 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hd6d\" (UniqueName: \"kubernetes.io/projected/ee996747-dcce-47bb-b6b1-493c9e40f6c5-kube-api-access-9hd6d\") pod \"ceilometer-0\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.562150 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:21 crc kubenswrapper[4708]: I0320 16:24:21.980368 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:22 crc kubenswrapper[4708]: I0320 16:24:22.037430 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:22 crc kubenswrapper[4708]: I0320 16:24:22.123470 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c28799-921e-4f2a-8b33-2025ddfe4b90" path="/var/lib/kubelet/pods/c5c28799-921e-4f2a-8b33-2025ddfe4b90/volumes" Mar 20 16:24:22 crc kubenswrapper[4708]: I0320 16:24:22.883517 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee996747-dcce-47bb-b6b1-493c9e40f6c5","Type":"ContainerStarted","Data":"d8b6e05de4156428cf88ae6c653377dd1c342b5a90c7bbe7998f694d60c7517b"} Mar 20 16:24:22 crc kubenswrapper[4708]: I0320 16:24:22.883916 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee996747-dcce-47bb-b6b1-493c9e40f6c5","Type":"ContainerStarted","Data":"0751d713423c619f1c3d9bd9a65e64daebfb9ce3e948ff52a895927a0ba22762"} Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.672280 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.731331 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-config-data\") pod \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.731451 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67ea6ed7-5874-45db-94f4-b77cff6ddc40-logs\") pod \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.731480 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-combined-ca-bundle\") pod \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.731520 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7dh9\" (UniqueName: \"kubernetes.io/projected/67ea6ed7-5874-45db-94f4-b77cff6ddc40-kube-api-access-j7dh9\") pod \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\" (UID: \"67ea6ed7-5874-45db-94f4-b77cff6ddc40\") " Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.732226 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67ea6ed7-5874-45db-94f4-b77cff6ddc40-logs" (OuterVolumeSpecName: "logs") pod "67ea6ed7-5874-45db-94f4-b77cff6ddc40" (UID: "67ea6ed7-5874-45db-94f4-b77cff6ddc40"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.736222 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67ea6ed7-5874-45db-94f4-b77cff6ddc40-kube-api-access-j7dh9" (OuterVolumeSpecName: "kube-api-access-j7dh9") pod "67ea6ed7-5874-45db-94f4-b77cff6ddc40" (UID: "67ea6ed7-5874-45db-94f4-b77cff6ddc40"). InnerVolumeSpecName "kube-api-access-j7dh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.766806 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67ea6ed7-5874-45db-94f4-b77cff6ddc40" (UID: "67ea6ed7-5874-45db-94f4-b77cff6ddc40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.783805 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-config-data" (OuterVolumeSpecName: "config-data") pod "67ea6ed7-5874-45db-94f4-b77cff6ddc40" (UID: "67ea6ed7-5874-45db-94f4-b77cff6ddc40"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.835012 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.835054 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67ea6ed7-5874-45db-94f4-b77cff6ddc40-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.835071 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ea6ed7-5874-45db-94f4-b77cff6ddc40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.835085 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7dh9\" (UniqueName: \"kubernetes.io/projected/67ea6ed7-5874-45db-94f4-b77cff6ddc40-kube-api-access-j7dh9\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:23 crc kubenswrapper[4708]: E0320 16:24:23.895213 4708 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf676e7f_5129_436c_9451_8a9b1c8c19c0.slice/crio-84915d182a57c84c0d30885753a625a1d23becd15174c9ff41a9700696654d79\": RecentStats: unable to find data in memory cache]" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.917914 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee996747-dcce-47bb-b6b1-493c9e40f6c5","Type":"ContainerStarted","Data":"52974e68528b934c0657c4a394b653aa2303c686bfce44cc58c61617a0240c9e"} Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.920609 4708 generic.go:334] "Generic (PLEG): container finished" podID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" 
containerID="a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba" exitCode=0 Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.920644 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67ea6ed7-5874-45db-94f4-b77cff6ddc40","Type":"ContainerDied","Data":"a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba"} Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.920662 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67ea6ed7-5874-45db-94f4-b77cff6ddc40","Type":"ContainerDied","Data":"4a0b653406c25eb160e02d01e34772a83f16fa67e7aab03bdd17640c53defcbf"} Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.920691 4708 scope.go:117] "RemoveContainer" containerID="a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.920829 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.949925 4708 scope.go:117] "RemoveContainer" containerID="1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.979811 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.987946 4708 scope.go:117] "RemoveContainer" containerID="a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba" Mar 20 16:24:23 crc kubenswrapper[4708]: E0320 16:24:23.988492 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba\": container with ID starting with a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba not found: ID does not exist" containerID="a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba" 
Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.988539 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba"} err="failed to get container status \"a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba\": rpc error: code = NotFound desc = could not find container \"a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba\": container with ID starting with a630a4ffee91fd9ca9ba3d7a192ba0eaa920bd3ae72edceef530419a30c093ba not found: ID does not exist" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.988569 4708 scope.go:117] "RemoveContainer" containerID="1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc" Mar 20 16:24:23 crc kubenswrapper[4708]: E0320 16:24:23.988911 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc\": container with ID starting with 1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc not found: ID does not exist" containerID="1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.988956 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc"} err="failed to get container status \"1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc\": rpc error: code = NotFound desc = could not find container \"1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc\": container with ID starting with 1e2cc4f2c1fc99d7474aaf73753c374351427c6a89628458ab88966de2d2aecc not found: ID does not exist" Mar 20 16:24:23 crc kubenswrapper[4708]: I0320 16:24:23.992979 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:24 crc 
kubenswrapper[4708]: I0320 16:24:24.007403 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:24 crc kubenswrapper[4708]: E0320 16:24:24.008122 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" containerName="nova-api-log" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.008152 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" containerName="nova-api-log" Mar 20 16:24:24 crc kubenswrapper[4708]: E0320 16:24:24.008184 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" containerName="nova-api-api" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.008194 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" containerName="nova-api-api" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.008477 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" containerName="nova-api-log" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.008514 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" containerName="nova-api-api" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.011628 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.011887 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.016957 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.017290 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.017419 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.044354 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-public-tls-certs\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.044475 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.044528 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22397977-15ae-4789-8a19-613b16c79fb6-logs\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.044582 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-488nn\" (UniqueName: 
\"kubernetes.io/projected/22397977-15ae-4789-8a19-613b16c79fb6-kube-api-access-488nn\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.044614 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.044657 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-config-data\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.132551 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67ea6ed7-5874-45db-94f4-b77cff6ddc40" path="/var/lib/kubelet/pods/67ea6ed7-5874-45db-94f4-b77cff6ddc40/volumes" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.150195 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-public-tls-certs\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.150322 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.150377 4708 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22397977-15ae-4789-8a19-613b16c79fb6-logs\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.150427 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-488nn\" (UniqueName: \"kubernetes.io/projected/22397977-15ae-4789-8a19-613b16c79fb6-kube-api-access-488nn\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.150471 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.150506 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-config-data\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.151068 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22397977-15ae-4789-8a19-613b16c79fb6-logs\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.163416 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-public-tls-certs\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc 
kubenswrapper[4708]: I0320 16:24:24.163416 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-config-data\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.178356 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-488nn\" (UniqueName: \"kubernetes.io/projected/22397977-15ae-4789-8a19-613b16c79fb6-kube-api-access-488nn\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.195123 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.195912 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.336436 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.840374 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.952331 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee996747-dcce-47bb-b6b1-493c9e40f6c5","Type":"ContainerStarted","Data":"b690b434a7e465dbdbf01c10fa084459978167ea296a6070634977202fb98b8b"} Mar 20 16:24:24 crc kubenswrapper[4708]: I0320 16:24:24.959050 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22397977-15ae-4789-8a19-613b16c79fb6","Type":"ContainerStarted","Data":"073a669aa4d6a2243ec33d57eb3da36cde2aeee5dbb8f17b4a03050dbe496838"} Mar 20 16:24:25 crc kubenswrapper[4708]: I0320 16:24:25.211544 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4708]: I0320 16:24:25.230651 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4708]: I0320 16:24:25.230752 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 16:24:25 crc kubenswrapper[4708]: I0320 16:24:25.234257 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:25 crc kubenswrapper[4708]: I0320 16:24:25.981113 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22397977-15ae-4789-8a19-613b16c79fb6","Type":"ContainerStarted","Data":"f5d844b29fa2ec17ac8e9eb25fea41a9b99ee25895f99a4f9891026e38e9cf75"} Mar 20 16:24:25 crc kubenswrapper[4708]: I0320 16:24:25.981465 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"22397977-15ae-4789-8a19-613b16c79fb6","Type":"ContainerStarted","Data":"229e8472d38db4b9e53d22a48ee4fd5606a52ed4c5b741cfc1a769a9e9dd728b"} Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.021030 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.022825 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.02281002 podStartE2EDuration="3.02281002s" podCreationTimestamp="2026-03-20 16:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:26.008218923 +0000 UTC m=+1420.682555638" watchObservedRunningTime="2026-03-20 16:24:26.02281002 +0000 UTC m=+1420.697146735" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.181009 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.181204 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.181258 4708 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.182088 4708 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"1802f9b5863d50f9b0059b425a0ca397c3796b016f71b5db8c43776fd2853ecb"} pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.182143 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" containerID="cri-o://1802f9b5863d50f9b0059b425a0ca397c3796b016f71b5db8c43776fd2853ecb" gracePeriod=600 Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.193230 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-nktb8"] Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.194612 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.200357 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.201203 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.208033 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-nktb8"] Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.239883 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2659505d-fddd-42b2-a820-21fd1bac6479" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.239869 4708 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="2659505d-fddd-42b2-a820-21fd1bac6479" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.311467 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-config-data\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.311540 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-scripts\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.311595 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.311797 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nx8v\" (UniqueName: \"kubernetes.io/projected/243ed677-1e35-40c7-ba55-a90eb9c7b85c-kube-api-access-6nx8v\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.412975 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-config-data\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.413318 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-scripts\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.413374 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.413446 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nx8v\" (UniqueName: \"kubernetes.io/projected/243ed677-1e35-40c7-ba55-a90eb9c7b85c-kube-api-access-6nx8v\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.417976 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.418141 4708 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-config-data\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.431879 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-scripts\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.431969 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nx8v\" (UniqueName: \"kubernetes.io/projected/243ed677-1e35-40c7-ba55-a90eb9c7b85c-kube-api-access-6nx8v\") pod \"nova-cell1-cell-mapping-nktb8\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.531433 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.895588 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-nktb8"] Mar 20 16:24:26 crc kubenswrapper[4708]: I0320 16:24:26.999886 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nktb8" event={"ID":"243ed677-1e35-40c7-ba55-a90eb9c7b85c","Type":"ContainerStarted","Data":"ba245bb0ec1b5d2bc4f341c8716077b5b4b776d917f91324615aaf3326023ec2"} Mar 20 16:24:27 crc kubenswrapper[4708]: I0320 16:24:27.003507 4708 generic.go:334] "Generic (PLEG): container finished" podID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerID="1802f9b5863d50f9b0059b425a0ca397c3796b016f71b5db8c43776fd2853ecb" exitCode=0 Mar 20 16:24:27 crc kubenswrapper[4708]: I0320 16:24:27.003614 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerDied","Data":"1802f9b5863d50f9b0059b425a0ca397c3796b016f71b5db8c43776fd2853ecb"} Mar 20 16:24:27 crc kubenswrapper[4708]: I0320 16:24:27.003644 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerStarted","Data":"722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445"} Mar 20 16:24:27 crc kubenswrapper[4708]: I0320 16:24:27.003660 4708 scope.go:117] "RemoveContainer" containerID="a0d736f2e0fcb77d6a0bd7e7c40db6605c442763cf48cb3f0e13e50d606ae696" Mar 20 16:24:27 crc kubenswrapper[4708]: I0320 16:24:27.016472 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="ceilometer-central-agent" 
containerID="cri-o://d8b6e05de4156428cf88ae6c653377dd1c342b5a90c7bbe7998f694d60c7517b" gracePeriod=30 Mar 20 16:24:27 crc kubenswrapper[4708]: I0320 16:24:27.016558 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee996747-dcce-47bb-b6b1-493c9e40f6c5","Type":"ContainerStarted","Data":"5efce386f39c4466c39cabf75ea11a1c87c076ebfb2f3669506005d0793f0549"} Mar 20 16:24:27 crc kubenswrapper[4708]: I0320 16:24:27.016842 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:24:27 crc kubenswrapper[4708]: I0320 16:24:27.016883 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="sg-core" containerID="cri-o://b690b434a7e465dbdbf01c10fa084459978167ea296a6070634977202fb98b8b" gracePeriod=30 Mar 20 16:24:27 crc kubenswrapper[4708]: I0320 16:24:27.016950 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="proxy-httpd" containerID="cri-o://5efce386f39c4466c39cabf75ea11a1c87c076ebfb2f3669506005d0793f0549" gracePeriod=30 Mar 20 16:24:27 crc kubenswrapper[4708]: I0320 16:24:27.017005 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="ceilometer-notification-agent" containerID="cri-o://52974e68528b934c0657c4a394b653aa2303c686bfce44cc58c61617a0240c9e" gracePeriod=30 Mar 20 16:24:27 crc kubenswrapper[4708]: I0320 16:24:27.089332 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.199481875 podStartE2EDuration="6.089318207s" podCreationTimestamp="2026-03-20 16:24:21 +0000 UTC" firstStartedPulling="2026-03-20 16:24:22.043291001 +0000 UTC m=+1416.717627716" 
lastFinishedPulling="2026-03-20 16:24:25.933127333 +0000 UTC m=+1420.607464048" observedRunningTime="2026-03-20 16:24:27.085997667 +0000 UTC m=+1421.760334372" watchObservedRunningTime="2026-03-20 16:24:27.089318207 +0000 UTC m=+1421.763654922" Mar 20 16:24:28 crc kubenswrapper[4708]: I0320 16:24:28.026383 4708 generic.go:334] "Generic (PLEG): container finished" podID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerID="5efce386f39c4466c39cabf75ea11a1c87c076ebfb2f3669506005d0793f0549" exitCode=0 Mar 20 16:24:28 crc kubenswrapper[4708]: I0320 16:24:28.026911 4708 generic.go:334] "Generic (PLEG): container finished" podID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerID="b690b434a7e465dbdbf01c10fa084459978167ea296a6070634977202fb98b8b" exitCode=2 Mar 20 16:24:28 crc kubenswrapper[4708]: I0320 16:24:28.026921 4708 generic.go:334] "Generic (PLEG): container finished" podID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerID="52974e68528b934c0657c4a394b653aa2303c686bfce44cc58c61617a0240c9e" exitCode=0 Mar 20 16:24:28 crc kubenswrapper[4708]: I0320 16:24:28.026457 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee996747-dcce-47bb-b6b1-493c9e40f6c5","Type":"ContainerDied","Data":"5efce386f39c4466c39cabf75ea11a1c87c076ebfb2f3669506005d0793f0549"} Mar 20 16:24:28 crc kubenswrapper[4708]: I0320 16:24:28.026991 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee996747-dcce-47bb-b6b1-493c9e40f6c5","Type":"ContainerDied","Data":"b690b434a7e465dbdbf01c10fa084459978167ea296a6070634977202fb98b8b"} Mar 20 16:24:28 crc kubenswrapper[4708]: I0320 16:24:28.027008 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee996747-dcce-47bb-b6b1-493c9e40f6c5","Type":"ContainerDied","Data":"52974e68528b934c0657c4a394b653aa2303c686bfce44cc58c61617a0240c9e"} Mar 20 16:24:28 crc kubenswrapper[4708]: I0320 16:24:28.028808 4708 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nktb8" event={"ID":"243ed677-1e35-40c7-ba55-a90eb9c7b85c","Type":"ContainerStarted","Data":"9fcc1914f821a4ddf6e7373578c1305cfa3ce59e3c6aa9499bf5893c942a03dc"} Mar 20 16:24:28 crc kubenswrapper[4708]: I0320 16:24:28.051269 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-nktb8" podStartSLOduration=2.051247994 podStartE2EDuration="2.051247994s" podCreationTimestamp="2026-03-20 16:24:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:28.041912801 +0000 UTC m=+1422.716249516" watchObservedRunningTime="2026-03-20 16:24:28.051247994 +0000 UTC m=+1422.725584709" Mar 20 16:24:28 crc kubenswrapper[4708]: I0320 16:24:28.307687 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-mkqbz" Mar 20 16:24:28 crc kubenswrapper[4708]: I0320 16:24:28.392456 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7p9zm"] Mar 20 16:24:28 crc kubenswrapper[4708]: I0320 16:24:28.392778 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" podUID="fc8feabe-d295-4e0a-b541-9bfd25628128" containerName="dnsmasq-dns" containerID="cri-o://0063a85d53b4de8c9ef6265aee8aea5aeedd224c1af6af410dfcfbaa21d2e673" gracePeriod=10 Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.052126 4708 generic.go:334] "Generic (PLEG): container finished" podID="fc8feabe-d295-4e0a-b541-9bfd25628128" containerID="0063a85d53b4de8c9ef6265aee8aea5aeedd224c1af6af410dfcfbaa21d2e673" exitCode=0 Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.052261 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" 
event={"ID":"fc8feabe-d295-4e0a-b541-9bfd25628128","Type":"ContainerDied","Data":"0063a85d53b4de8c9ef6265aee8aea5aeedd224c1af6af410dfcfbaa21d2e673"} Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.052827 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" event={"ID":"fc8feabe-d295-4e0a-b541-9bfd25628128","Type":"ContainerDied","Data":"3e3aeb71c3aa2c2d40861cba57ac5f17ee96f5a78c3e12b32b272182304e080b"} Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.052856 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e3aeb71c3aa2c2d40861cba57ac5f17ee96f5a78c3e12b32b272182304e080b" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.053028 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.076177 4708 generic.go:334] "Generic (PLEG): container finished" podID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerID="d8b6e05de4156428cf88ae6c653377dd1c342b5a90c7bbe7998f694d60c7517b" exitCode=0 Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.077104 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee996747-dcce-47bb-b6b1-493c9e40f6c5","Type":"ContainerDied","Data":"d8b6e05de4156428cf88ae6c653377dd1c342b5a90c7bbe7998f694d60c7517b"} Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.179940 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-svc\") pod \"fc8feabe-d295-4e0a-b541-9bfd25628128\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.180293 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glmb2\" (UniqueName: 
\"kubernetes.io/projected/fc8feabe-d295-4e0a-b541-9bfd25628128-kube-api-access-glmb2\") pod \"fc8feabe-d295-4e0a-b541-9bfd25628128\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.180538 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-sb\") pod \"fc8feabe-d295-4e0a-b541-9bfd25628128\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.180666 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-swift-storage-0\") pod \"fc8feabe-d295-4e0a-b541-9bfd25628128\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.180842 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-config\") pod \"fc8feabe-d295-4e0a-b541-9bfd25628128\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.181018 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-nb\") pod \"fc8feabe-d295-4e0a-b541-9bfd25628128\" (UID: \"fc8feabe-d295-4e0a-b541-9bfd25628128\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.207122 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8feabe-d295-4e0a-b541-9bfd25628128-kube-api-access-glmb2" (OuterVolumeSpecName: "kube-api-access-glmb2") pod "fc8feabe-d295-4e0a-b541-9bfd25628128" (UID: "fc8feabe-d295-4e0a-b541-9bfd25628128"). InnerVolumeSpecName "kube-api-access-glmb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.239342 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc8feabe-d295-4e0a-b541-9bfd25628128" (UID: "fc8feabe-d295-4e0a-b541-9bfd25628128"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.242303 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc8feabe-d295-4e0a-b541-9bfd25628128" (UID: "fc8feabe-d295-4e0a-b541-9bfd25628128"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.268733 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-config" (OuterVolumeSpecName: "config") pod "fc8feabe-d295-4e0a-b541-9bfd25628128" (UID: "fc8feabe-d295-4e0a-b541-9bfd25628128"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.270515 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc8feabe-d295-4e0a-b541-9bfd25628128" (UID: "fc8feabe-d295-4e0a-b541-9bfd25628128"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.273282 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc8feabe-d295-4e0a-b541-9bfd25628128" (UID: "fc8feabe-d295-4e0a-b541-9bfd25628128"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.284099 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.284132 4708 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.284143 4708 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-config\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.284152 4708 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.284163 4708 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc8feabe-d295-4e0a-b541-9bfd25628128-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.284171 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glmb2\" (UniqueName: 
\"kubernetes.io/projected/fc8feabe-d295-4e0a-b541-9bfd25628128-kube-api-access-glmb2\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.301766 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.487632 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-combined-ca-bundle\") pod \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.487765 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-ceilometer-tls-certs\") pod \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.487866 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-log-httpd\") pod \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.488036 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-sg-core-conf-yaml\") pod \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.488067 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-config-data\") pod 
\"ee996747-dcce-47bb-b6b1-493c9e40f6c5\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.488126 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-run-httpd\") pod \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.488167 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hd6d\" (UniqueName: \"kubernetes.io/projected/ee996747-dcce-47bb-b6b1-493c9e40f6c5-kube-api-access-9hd6d\") pod \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.488206 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-scripts\") pod \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\" (UID: \"ee996747-dcce-47bb-b6b1-493c9e40f6c5\") " Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.489592 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ee996747-dcce-47bb-b6b1-493c9e40f6c5" (UID: "ee996747-dcce-47bb-b6b1-493c9e40f6c5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.489889 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ee996747-dcce-47bb-b6b1-493c9e40f6c5" (UID: "ee996747-dcce-47bb-b6b1-493c9e40f6c5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.492831 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-scripts" (OuterVolumeSpecName: "scripts") pod "ee996747-dcce-47bb-b6b1-493c9e40f6c5" (UID: "ee996747-dcce-47bb-b6b1-493c9e40f6c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.495575 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee996747-dcce-47bb-b6b1-493c9e40f6c5-kube-api-access-9hd6d" (OuterVolumeSpecName: "kube-api-access-9hd6d") pod "ee996747-dcce-47bb-b6b1-493c9e40f6c5" (UID: "ee996747-dcce-47bb-b6b1-493c9e40f6c5"). InnerVolumeSpecName "kube-api-access-9hd6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.533482 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ee996747-dcce-47bb-b6b1-493c9e40f6c5" (UID: "ee996747-dcce-47bb-b6b1-493c9e40f6c5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.581191 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ee996747-dcce-47bb-b6b1-493c9e40f6c5" (UID: "ee996747-dcce-47bb-b6b1-493c9e40f6c5"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.582287 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee996747-dcce-47bb-b6b1-493c9e40f6c5" (UID: "ee996747-dcce-47bb-b6b1-493c9e40f6c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.590949 4708 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.590977 4708 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.590987 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hd6d\" (UniqueName: \"kubernetes.io/projected/ee996747-dcce-47bb-b6b1-493c9e40f6c5-kube-api-access-9hd6d\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.590997 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.591005 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.591013 4708 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.591021 4708 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee996747-dcce-47bb-b6b1-493c9e40f6c5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.610959 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-config-data" (OuterVolumeSpecName: "config-data") pod "ee996747-dcce-47bb-b6b1-493c9e40f6c5" (UID: "ee996747-dcce-47bb-b6b1-493c9e40f6c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:29 crc kubenswrapper[4708]: I0320 16:24:29.692859 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee996747-dcce-47bb-b6b1-493c9e40f6c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.125158 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.125871 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.128172 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee996747-dcce-47bb-b6b1-493c9e40f6c5","Type":"ContainerDied","Data":"0751d713423c619f1c3d9bd9a65e64daebfb9ce3e948ff52a895927a0ba22762"} Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.128235 4708 scope.go:117] "RemoveContainer" containerID="5efce386f39c4466c39cabf75ea11a1c87c076ebfb2f3669506005d0793f0549" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.168643 4708 scope.go:117] "RemoveContainer" containerID="b690b434a7e465dbdbf01c10fa084459978167ea296a6070634977202fb98b8b" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.172244 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.187488 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.205011 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7p9zm"] Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.205021 4708 scope.go:117] "RemoveContainer" containerID="52974e68528b934c0657c4a394b653aa2303c686bfce44cc58c61617a0240c9e" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.221761 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7p9zm"] Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233024 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:30 crc kubenswrapper[4708]: E0320 16:24:30.233595 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="ceilometer-central-agent" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233622 4708 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="ceilometer-central-agent" Mar 20 16:24:30 crc kubenswrapper[4708]: E0320 16:24:30.233632 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="ceilometer-notification-agent" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233640 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="ceilometer-notification-agent" Mar 20 16:24:30 crc kubenswrapper[4708]: E0320 16:24:30.233658 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8feabe-d295-4e0a-b541-9bfd25628128" containerName="init" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233683 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8feabe-d295-4e0a-b541-9bfd25628128" containerName="init" Mar 20 16:24:30 crc kubenswrapper[4708]: E0320 16:24:30.233691 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="sg-core" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233698 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="sg-core" Mar 20 16:24:30 crc kubenswrapper[4708]: E0320 16:24:30.233726 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="proxy-httpd" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233732 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="proxy-httpd" Mar 20 16:24:30 crc kubenswrapper[4708]: E0320 16:24:30.233742 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8feabe-d295-4e0a-b541-9bfd25628128" containerName="dnsmasq-dns" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233747 4708 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fc8feabe-d295-4e0a-b541-9bfd25628128" containerName="dnsmasq-dns" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233942 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="ceilometer-notification-agent" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233955 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="sg-core" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233965 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="proxy-httpd" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233980 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" containerName="ceilometer-central-agent" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.233992 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8feabe-d295-4e0a-b541-9bfd25628128" containerName="dnsmasq-dns" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.242547 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.246041 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.246245 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.246398 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.250098 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.258565 4708 scope.go:117] "RemoveContainer" containerID="d8b6e05de4156428cf88ae6c653377dd1c342b5a90c7bbe7998f694d60c7517b" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.310785 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.310848 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f87093e-f826-4a63-b47f-60c6fba18500-log-httpd\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.310903 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " 
pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.310937 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7ms\" (UniqueName: \"kubernetes.io/projected/2f87093e-f826-4a63-b47f-60c6fba18500-kube-api-access-pq7ms\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.310960 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.310995 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-scripts\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.311012 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-config-data\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.311029 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f87093e-f826-4a63-b47f-60c6fba18500-run-httpd\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.412834 4708 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f87093e-f826-4a63-b47f-60c6fba18500-log-httpd\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.412903 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.412940 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7ms\" (UniqueName: \"kubernetes.io/projected/2f87093e-f826-4a63-b47f-60c6fba18500-kube-api-access-pq7ms\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.412957 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.412984 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-scripts\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.413006 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-config-data\") pod \"ceilometer-0\" (UID: 
\"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.413027 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f87093e-f826-4a63-b47f-60c6fba18500-run-httpd\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.413103 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.413445 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f87093e-f826-4a63-b47f-60c6fba18500-log-httpd\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.414100 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f87093e-f826-4a63-b47f-60c6fba18500-run-httpd\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.417333 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-scripts\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.417334 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.418159 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.418264 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.426606 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f87093e-f826-4a63-b47f-60c6fba18500-config-data\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.430621 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7ms\" (UniqueName: \"kubernetes.io/projected/2f87093e-f826-4a63-b47f-60c6fba18500-kube-api-access-pq7ms\") pod \"ceilometer-0\" (UID: \"2f87093e-f826-4a63-b47f-60c6fba18500\") " pod="openstack/ceilometer-0" Mar 20 16:24:30 crc kubenswrapper[4708]: I0320 16:24:30.572658 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 16:24:31 crc kubenswrapper[4708]: I0320 16:24:31.076226 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 16:24:31 crc kubenswrapper[4708]: W0320 16:24:31.087835 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f87093e_f826_4a63_b47f_60c6fba18500.slice/crio-a087e46e60beaf5b22b31f67919dbfb4f5cd8a9864cee1ac365f3d3bb02181db WatchSource:0}: Error finding container a087e46e60beaf5b22b31f67919dbfb4f5cd8a9864cee1ac365f3d3bb02181db: Status 404 returned error can't find the container with id a087e46e60beaf5b22b31f67919dbfb4f5cd8a9864cee1ac365f3d3bb02181db Mar 20 16:24:31 crc kubenswrapper[4708]: I0320 16:24:31.142533 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f87093e-f826-4a63-b47f-60c6fba18500","Type":"ContainerStarted","Data":"a087e46e60beaf5b22b31f67919dbfb4f5cd8a9864cee1ac365f3d3bb02181db"} Mar 20 16:24:32 crc kubenswrapper[4708]: I0320 16:24:32.142114 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee996747-dcce-47bb-b6b1-493c9e40f6c5" path="/var/lib/kubelet/pods/ee996747-dcce-47bb-b6b1-493c9e40f6c5/volumes" Mar 20 16:24:32 crc kubenswrapper[4708]: I0320 16:24:32.144193 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc8feabe-d295-4e0a-b541-9bfd25628128" path="/var/lib/kubelet/pods/fc8feabe-d295-4e0a-b541-9bfd25628128/volumes" Mar 20 16:24:32 crc kubenswrapper[4708]: I0320 16:24:32.194045 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f87093e-f826-4a63-b47f-60c6fba18500","Type":"ContainerStarted","Data":"1684ce77e87785d7c3a562ccb78747b780874d9428dfdcebba2f7c21abc2efd5"} Mar 20 16:24:33 crc kubenswrapper[4708]: I0320 16:24:33.204575 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2f87093e-f826-4a63-b47f-60c6fba18500","Type":"ContainerStarted","Data":"224e12ff238cf3b3bac28c0a98131deab42dee925a6a50425914f00208018fc5"} Mar 20 16:24:33 crc kubenswrapper[4708]: I0320 16:24:33.204845 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f87093e-f826-4a63-b47f-60c6fba18500","Type":"ContainerStarted","Data":"6f4fb787b24223ad7ef919c849eb6a78b9060e6d2e0ab35c66634135a4a90851"} Mar 20 16:24:33 crc kubenswrapper[4708]: I0320 16:24:33.206585 4708 generic.go:334] "Generic (PLEG): container finished" podID="243ed677-1e35-40c7-ba55-a90eb9c7b85c" containerID="9fcc1914f821a4ddf6e7373578c1305cfa3ce59e3c6aa9499bf5893c942a03dc" exitCode=0 Mar 20 16:24:33 crc kubenswrapper[4708]: I0320 16:24:33.206608 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nktb8" event={"ID":"243ed677-1e35-40c7-ba55-a90eb9c7b85c","Type":"ContainerDied","Data":"9fcc1914f821a4ddf6e7373578c1305cfa3ce59e3c6aa9499bf5893c942a03dc"} Mar 20 16:24:33 crc kubenswrapper[4708]: I0320 16:24:33.227384 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 16:24:33 crc kubenswrapper[4708]: I0320 16:24:33.227437 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 16:24:33 crc kubenswrapper[4708]: I0320 16:24:33.761786 4708 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-7p9zm" podUID="fc8feabe-d295-4e0a-b541-9bfd25628128" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.194:5353: i/o timeout" Mar 20 16:24:34 crc kubenswrapper[4708]: E0320 16:24:34.133517 4708 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf676e7f_5129_436c_9451_8a9b1c8c19c0.slice/crio-84915d182a57c84c0d30885753a625a1d23becd15174c9ff41a9700696654d79\": RecentStats: unable to find data in memory cache]" Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.338279 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.338350 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.645058 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.800525 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-config-data\") pod \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.800711 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-scripts\") pod \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.800800 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-combined-ca-bundle\") pod \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.800868 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nx8v\" (UniqueName: 
\"kubernetes.io/projected/243ed677-1e35-40c7-ba55-a90eb9c7b85c-kube-api-access-6nx8v\") pod \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\" (UID: \"243ed677-1e35-40c7-ba55-a90eb9c7b85c\") " Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.806307 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-scripts" (OuterVolumeSpecName: "scripts") pod "243ed677-1e35-40c7-ba55-a90eb9c7b85c" (UID: "243ed677-1e35-40c7-ba55-a90eb9c7b85c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.808008 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/243ed677-1e35-40c7-ba55-a90eb9c7b85c-kube-api-access-6nx8v" (OuterVolumeSpecName: "kube-api-access-6nx8v") pod "243ed677-1e35-40c7-ba55-a90eb9c7b85c" (UID: "243ed677-1e35-40c7-ba55-a90eb9c7b85c"). InnerVolumeSpecName "kube-api-access-6nx8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.831539 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-config-data" (OuterVolumeSpecName: "config-data") pod "243ed677-1e35-40c7-ba55-a90eb9c7b85c" (UID: "243ed677-1e35-40c7-ba55-a90eb9c7b85c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.842943 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "243ed677-1e35-40c7-ba55-a90eb9c7b85c" (UID: "243ed677-1e35-40c7-ba55-a90eb9c7b85c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.903754 4708 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.903788 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.903828 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nx8v\" (UniqueName: \"kubernetes.io/projected/243ed677-1e35-40c7-ba55-a90eb9c7b85c-kube-api-access-6nx8v\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:34 crc kubenswrapper[4708]: I0320 16:24:34.903838 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/243ed677-1e35-40c7-ba55-a90eb9c7b85c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.227100 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nktb8" Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.227253 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nktb8" event={"ID":"243ed677-1e35-40c7-ba55-a90eb9c7b85c","Type":"ContainerDied","Data":"ba245bb0ec1b5d2bc4f341c8716077b5b4b776d917f91324615aaf3326023ec2"} Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.227686 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba245bb0ec1b5d2bc4f341c8716077b5b4b776d917f91324615aaf3326023ec2" Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.230493 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f87093e-f826-4a63-b47f-60c6fba18500","Type":"ContainerStarted","Data":"582b5037ce42df9d4b632d59f4a2dc5d8faf35d00fe8a405589c5d0eace80f61"} Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.231940 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.242066 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.252067 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.253243 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.275203 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.482549589 podStartE2EDuration="5.275182448s" podCreationTimestamp="2026-03-20 16:24:30 +0000 UTC" firstStartedPulling="2026-03-20 16:24:31.089857608 +0000 UTC m=+1425.764194323" lastFinishedPulling="2026-03-20 16:24:34.882490467 +0000 UTC 
m=+1429.556827182" observedRunningTime="2026-03-20 16:24:35.26497279 +0000 UTC m=+1429.939309525" watchObservedRunningTime="2026-03-20 16:24:35.275182448 +0000 UTC m=+1429.949519163" Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.351953 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22397977-15ae-4789-8a19-613b16c79fb6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.351989 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22397977-15ae-4789-8a19-613b16c79fb6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.379380 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.379833 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="22397977-15ae-4789-8a19-613b16c79fb6" containerName="nova-api-log" containerID="cri-o://229e8472d38db4b9e53d22a48ee4fd5606a52ed4c5b741cfc1a769a9e9dd728b" gracePeriod=30 Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.379926 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="22397977-15ae-4789-8a19-613b16c79fb6" containerName="nova-api-api" containerID="cri-o://f5d844b29fa2ec17ac8e9eb25fea41a9b99ee25895f99a4f9891026e38e9cf75" gracePeriod=30 Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.405896 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.406205 4708 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-scheduler-0" podUID="bb6ea07b-75ea-4861-866a-361b91e1c277" containerName="nova-scheduler-scheduler" containerID="cri-o://eb25e4e247261583daa1a69711542c6735a3453bd1ba79928a55fd1b90a9063d" gracePeriod=30 Mar 20 16:24:35 crc kubenswrapper[4708]: I0320 16:24:35.416427 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:35 crc kubenswrapper[4708]: E0320 16:24:35.897197 4708 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eb25e4e247261583daa1a69711542c6735a3453bd1ba79928a55fd1b90a9063d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 16:24:35 crc kubenswrapper[4708]: E0320 16:24:35.902057 4708 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eb25e4e247261583daa1a69711542c6735a3453bd1ba79928a55fd1b90a9063d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 16:24:35 crc kubenswrapper[4708]: E0320 16:24:35.906188 4708 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eb25e4e247261583daa1a69711542c6735a3453bd1ba79928a55fd1b90a9063d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 16:24:35 crc kubenswrapper[4708]: E0320 16:24:35.906282 4708 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bb6ea07b-75ea-4861-866a-361b91e1c277" containerName="nova-scheduler-scheduler" Mar 20 16:24:36 crc 
kubenswrapper[4708]: I0320 16:24:36.243987 4708 generic.go:334] "Generic (PLEG): container finished" podID="22397977-15ae-4789-8a19-613b16c79fb6" containerID="229e8472d38db4b9e53d22a48ee4fd5606a52ed4c5b741cfc1a769a9e9dd728b" exitCode=143 Mar 20 16:24:36 crc kubenswrapper[4708]: I0320 16:24:36.244079 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22397977-15ae-4789-8a19-613b16c79fb6","Type":"ContainerDied","Data":"229e8472d38db4b9e53d22a48ee4fd5606a52ed4c5b741cfc1a769a9e9dd728b"} Mar 20 16:24:36 crc kubenswrapper[4708]: I0320 16:24:36.254879 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 16:24:37 crc kubenswrapper[4708]: I0320 16:24:37.262274 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2659505d-fddd-42b2-a820-21fd1bac6479" containerName="nova-metadata-log" containerID="cri-o://ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2" gracePeriod=30 Mar 20 16:24:37 crc kubenswrapper[4708]: I0320 16:24:37.262536 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2659505d-fddd-42b2-a820-21fd1bac6479" containerName="nova-metadata-metadata" containerID="cri-o://392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def" gracePeriod=30 Mar 20 16:24:38 crc kubenswrapper[4708]: I0320 16:24:38.277858 4708 generic.go:334] "Generic (PLEG): container finished" podID="2659505d-fddd-42b2-a820-21fd1bac6479" containerID="ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2" exitCode=143 Mar 20 16:24:38 crc kubenswrapper[4708]: I0320 16:24:38.277944 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2659505d-fddd-42b2-a820-21fd1bac6479","Type":"ContainerDied","Data":"ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2"} Mar 20 16:24:40 crc 
kubenswrapper[4708]: I0320 16:24:40.297601 4708 generic.go:334] "Generic (PLEG): container finished" podID="bb6ea07b-75ea-4861-866a-361b91e1c277" containerID="eb25e4e247261583daa1a69711542c6735a3453bd1ba79928a55fd1b90a9063d" exitCode=0 Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.298099 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bb6ea07b-75ea-4861-866a-361b91e1c277","Type":"ContainerDied","Data":"eb25e4e247261583daa1a69711542c6735a3453bd1ba79928a55fd1b90a9063d"} Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.298130 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bb6ea07b-75ea-4861-866a-361b91e1c277","Type":"ContainerDied","Data":"6588566776bb504c233ee88e41783c20bc35256ae5e1c71e250ebd8bbf9ea7cc"} Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.298141 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6588566776bb504c233ee88e41783c20bc35256ae5e1c71e250ebd8bbf9ea7cc" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.316912 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.421442 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcfbx\" (UniqueName: \"kubernetes.io/projected/bb6ea07b-75ea-4861-866a-361b91e1c277-kube-api-access-gcfbx\") pod \"bb6ea07b-75ea-4861-866a-361b91e1c277\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.422351 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-combined-ca-bundle\") pod \"bb6ea07b-75ea-4861-866a-361b91e1c277\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.422459 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-config-data\") pod \"bb6ea07b-75ea-4861-866a-361b91e1c277\" (UID: \"bb6ea07b-75ea-4861-866a-361b91e1c277\") " Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.441024 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6ea07b-75ea-4861-866a-361b91e1c277-kube-api-access-gcfbx" (OuterVolumeSpecName: "kube-api-access-gcfbx") pod "bb6ea07b-75ea-4861-866a-361b91e1c277" (UID: "bb6ea07b-75ea-4861-866a-361b91e1c277"). InnerVolumeSpecName "kube-api-access-gcfbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.450842 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-config-data" (OuterVolumeSpecName: "config-data") pod "bb6ea07b-75ea-4861-866a-361b91e1c277" (UID: "bb6ea07b-75ea-4861-866a-361b91e1c277"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.463396 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb6ea07b-75ea-4861-866a-361b91e1c277" (UID: "bb6ea07b-75ea-4861-866a-361b91e1c277"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.525204 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcfbx\" (UniqueName: \"kubernetes.io/projected/bb6ea07b-75ea-4861-866a-361b91e1c277-kube-api-access-gcfbx\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.525243 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.525253 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6ea07b-75ea-4861-866a-361b91e1c277-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.724315 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.829723 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-config-data\") pod \"2659505d-fddd-42b2-a820-21fd1bac6479\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.829833 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-nova-metadata-tls-certs\") pod \"2659505d-fddd-42b2-a820-21fd1bac6479\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.829873 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2659505d-fddd-42b2-a820-21fd1bac6479-logs\") pod \"2659505d-fddd-42b2-a820-21fd1bac6479\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.829910 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-combined-ca-bundle\") pod \"2659505d-fddd-42b2-a820-21fd1bac6479\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.829946 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtr5k\" (UniqueName: \"kubernetes.io/projected/2659505d-fddd-42b2-a820-21fd1bac6479-kube-api-access-jtr5k\") pod \"2659505d-fddd-42b2-a820-21fd1bac6479\" (UID: \"2659505d-fddd-42b2-a820-21fd1bac6479\") " Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.831278 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2659505d-fddd-42b2-a820-21fd1bac6479-logs" (OuterVolumeSpecName: "logs") pod "2659505d-fddd-42b2-a820-21fd1bac6479" (UID: "2659505d-fddd-42b2-a820-21fd1bac6479"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.834944 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2659505d-fddd-42b2-a820-21fd1bac6479-kube-api-access-jtr5k" (OuterVolumeSpecName: "kube-api-access-jtr5k") pod "2659505d-fddd-42b2-a820-21fd1bac6479" (UID: "2659505d-fddd-42b2-a820-21fd1bac6479"). InnerVolumeSpecName "kube-api-access-jtr5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.859581 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2659505d-fddd-42b2-a820-21fd1bac6479" (UID: "2659505d-fddd-42b2-a820-21fd1bac6479"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.860252 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-config-data" (OuterVolumeSpecName: "config-data") pod "2659505d-fddd-42b2-a820-21fd1bac6479" (UID: "2659505d-fddd-42b2-a820-21fd1bac6479"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.901867 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2659505d-fddd-42b2-a820-21fd1bac6479" (UID: "2659505d-fddd-42b2-a820-21fd1bac6479"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.936908 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.936951 4708 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.936964 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2659505d-fddd-42b2-a820-21fd1bac6479-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.936976 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2659505d-fddd-42b2-a820-21fd1bac6479-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:40 crc kubenswrapper[4708]: I0320 16:24:40.936987 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtr5k\" (UniqueName: \"kubernetes.io/projected/2659505d-fddd-42b2-a820-21fd1bac6479-kube-api-access-jtr5k\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.310594 4708 generic.go:334] "Generic (PLEG): container finished" podID="2659505d-fddd-42b2-a820-21fd1bac6479" containerID="392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def" exitCode=0 Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.310685 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.310660 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2659505d-fddd-42b2-a820-21fd1bac6479","Type":"ContainerDied","Data":"392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def"} Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.310844 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2659505d-fddd-42b2-a820-21fd1bac6479","Type":"ContainerDied","Data":"79abddf9ba44fe01b559f702ef729ac7e79b32918d88e269286f3abf174ba9a4"} Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.310889 4708 scope.go:117] "RemoveContainer" containerID="392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.312807 4708 generic.go:334] "Generic (PLEG): container finished" podID="22397977-15ae-4789-8a19-613b16c79fb6" containerID="f5d844b29fa2ec17ac8e9eb25fea41a9b99ee25895f99a4f9891026e38e9cf75" exitCode=0 Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.312879 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22397977-15ae-4789-8a19-613b16c79fb6","Type":"ContainerDied","Data":"f5d844b29fa2ec17ac8e9eb25fea41a9b99ee25895f99a4f9891026e38e9cf75"} Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.312892 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.312915 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22397977-15ae-4789-8a19-613b16c79fb6","Type":"ContainerDied","Data":"073a669aa4d6a2243ec33d57eb3da36cde2aeee5dbb8f17b4a03050dbe496838"} Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.312931 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="073a669aa4d6a2243ec33d57eb3da36cde2aeee5dbb8f17b4a03050dbe496838" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.336224 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.347101 4708 scope.go:117] "RemoveContainer" containerID="ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.353853 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.371121 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.379439 4708 scope.go:117] "RemoveContainer" containerID="392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.379792 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:41 crc kubenswrapper[4708]: E0320 16:24:41.379992 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def\": container with ID starting with 392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def not found: ID does not exist" 
containerID="392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380090 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def"} err="failed to get container status \"392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def\": rpc error: code = NotFound desc = could not find container \"392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def\": container with ID starting with 392251b2670fc5305a4ddd07075f0c9f45e802db5866618bc6bf0a7bd9a38def not found: ID does not exist" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380174 4708 scope.go:117] "RemoveContainer" containerID="ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2" Mar 20 16:24:41 crc kubenswrapper[4708]: E0320 16:24:41.380219 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22397977-15ae-4789-8a19-613b16c79fb6" containerName="nova-api-log" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380331 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="22397977-15ae-4789-8a19-613b16c79fb6" containerName="nova-api-log" Mar 20 16:24:41 crc kubenswrapper[4708]: E0320 16:24:41.380347 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2659505d-fddd-42b2-a820-21fd1bac6479" containerName="nova-metadata-log" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380354 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="2659505d-fddd-42b2-a820-21fd1bac6479" containerName="nova-metadata-log" Mar 20 16:24:41 crc kubenswrapper[4708]: E0320 16:24:41.380381 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6ea07b-75ea-4861-866a-361b91e1c277" containerName="nova-scheduler-scheduler" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380429 4708 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb6ea07b-75ea-4861-866a-361b91e1c277" containerName="nova-scheduler-scheduler" Mar 20 16:24:41 crc kubenswrapper[4708]: E0320 16:24:41.380443 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2659505d-fddd-42b2-a820-21fd1bac6479" containerName="nova-metadata-metadata" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380449 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="2659505d-fddd-42b2-a820-21fd1bac6479" containerName="nova-metadata-metadata" Mar 20 16:24:41 crc kubenswrapper[4708]: E0320 16:24:41.380462 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243ed677-1e35-40c7-ba55-a90eb9c7b85c" containerName="nova-manage" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380468 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="243ed677-1e35-40c7-ba55-a90eb9c7b85c" containerName="nova-manage" Mar 20 16:24:41 crc kubenswrapper[4708]: E0320 16:24:41.380482 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22397977-15ae-4789-8a19-613b16c79fb6" containerName="nova-api-api" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380488 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="22397977-15ae-4789-8a19-613b16c79fb6" containerName="nova-api-api" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380711 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="22397977-15ae-4789-8a19-613b16c79fb6" containerName="nova-api-api" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380727 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6ea07b-75ea-4861-866a-361b91e1c277" containerName="nova-scheduler-scheduler" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380739 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="2659505d-fddd-42b2-a820-21fd1bac6479" containerName="nova-metadata-log" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380759 4708 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="243ed677-1e35-40c7-ba55-a90eb9c7b85c" containerName="nova-manage" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380772 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="22397977-15ae-4789-8a19-613b16c79fb6" containerName="nova-api-log" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.380782 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="2659505d-fddd-42b2-a820-21fd1bac6479" containerName="nova-metadata-metadata" Mar 20 16:24:41 crc kubenswrapper[4708]: E0320 16:24:41.381100 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2\": container with ID starting with ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2 not found: ID does not exist" containerID="ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.381171 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2"} err="failed to get container status \"ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2\": rpc error: code = NotFound desc = could not find container \"ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2\": container with ID starting with ba64494662bfe560d8216f45e71dbfa5c2e668f6563545ad33470bfa91ff7ec2 not found: ID does not exist" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.381420 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.384756 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.390758 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.438477 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.448561 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.456830 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-config-data\") pod \"22397977-15ae-4789-8a19-613b16c79fb6\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.456893 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-public-tls-certs\") pod \"22397977-15ae-4789-8a19-613b16c79fb6\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.456932 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-internal-tls-certs\") pod \"22397977-15ae-4789-8a19-613b16c79fb6\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.457074 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22397977-15ae-4789-8a19-613b16c79fb6-logs\") pod 
\"22397977-15ae-4789-8a19-613b16c79fb6\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.457196 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-combined-ca-bundle\") pod \"22397977-15ae-4789-8a19-613b16c79fb6\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.457269 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-488nn\" (UniqueName: \"kubernetes.io/projected/22397977-15ae-4789-8a19-613b16c79fb6-kube-api-access-488nn\") pod \"22397977-15ae-4789-8a19-613b16c79fb6\" (UID: \"22397977-15ae-4789-8a19-613b16c79fb6\") " Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.457622 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22397977-15ae-4789-8a19-613b16c79fb6-logs" (OuterVolumeSpecName: "logs") pod "22397977-15ae-4789-8a19-613b16c79fb6" (UID: "22397977-15ae-4789-8a19-613b16c79fb6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.467968 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.468829 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22397977-15ae-4789-8a19-613b16c79fb6-kube-api-access-488nn" (OuterVolumeSpecName: "kube-api-access-488nn") pod "22397977-15ae-4789-8a19-613b16c79fb6" (UID: "22397977-15ae-4789-8a19-613b16c79fb6"). InnerVolumeSpecName "kube-api-access-488nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.474572 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.476708 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.476826 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.477065 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.494428 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22397977-15ae-4789-8a19-613b16c79fb6" (UID: "22397977-15ae-4789-8a19-613b16c79fb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.503437 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-config-data" (OuterVolumeSpecName: "config-data") pod "22397977-15ae-4789-8a19-613b16c79fb6" (UID: "22397977-15ae-4789-8a19-613b16c79fb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.546012 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "22397977-15ae-4789-8a19-613b16c79fb6" (UID: "22397977-15ae-4789-8a19-613b16c79fb6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.546642 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "22397977-15ae-4789-8a19-613b16c79fb6" (UID: "22397977-15ae-4789-8a19-613b16c79fb6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.559545 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6baef84a-7b0b-4a61-9f68-691ccd14f3a6-config-data\") pod \"nova-scheduler-0\" (UID: \"6baef84a-7b0b-4a61-9f68-691ccd14f3a6\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.560037 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm5tz\" (UniqueName: \"kubernetes.io/projected/6baef84a-7b0b-4a61-9f68-691ccd14f3a6-kube-api-access-cm5tz\") pod \"nova-scheduler-0\" (UID: \"6baef84a-7b0b-4a61-9f68-691ccd14f3a6\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.560245 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6baef84a-7b0b-4a61-9f68-691ccd14f3a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6baef84a-7b0b-4a61-9f68-691ccd14f3a6\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.560423 4708 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.560488 
4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-488nn\" (UniqueName: \"kubernetes.io/projected/22397977-15ae-4789-8a19-613b16c79fb6-kube-api-access-488nn\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.560542 4708 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.560596 4708 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.560660 4708 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22397977-15ae-4789-8a19-613b16c79fb6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.560799 4708 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22397977-15ae-4789-8a19-613b16c79fb6-logs\") on node \"crc\" DevicePath \"\"" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.662703 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6baef84a-7b0b-4a61-9f68-691ccd14f3a6-config-data\") pod \"nova-scheduler-0\" (UID: \"6baef84a-7b0b-4a61-9f68-691ccd14f3a6\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.663290 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hrbd\" (UniqueName: \"kubernetes.io/projected/30d5915f-5d5c-4763-a550-0c1ac74086a2-kube-api-access-9hrbd\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " 
pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.663320 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/30d5915f-5d5c-4763-a550-0c1ac74086a2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.663436 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30d5915f-5d5c-4763-a550-0c1ac74086a2-logs\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.663547 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm5tz\" (UniqueName: \"kubernetes.io/projected/6baef84a-7b0b-4a61-9f68-691ccd14f3a6-kube-api-access-cm5tz\") pod \"nova-scheduler-0\" (UID: \"6baef84a-7b0b-4a61-9f68-691ccd14f3a6\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.663629 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d5915f-5d5c-4763-a550-0c1ac74086a2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.663689 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d5915f-5d5c-4763-a550-0c1ac74086a2-config-data\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.663732 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6baef84a-7b0b-4a61-9f68-691ccd14f3a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6baef84a-7b0b-4a61-9f68-691ccd14f3a6\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.666413 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6baef84a-7b0b-4a61-9f68-691ccd14f3a6-config-data\") pod \"nova-scheduler-0\" (UID: \"6baef84a-7b0b-4a61-9f68-691ccd14f3a6\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.667098 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6baef84a-7b0b-4a61-9f68-691ccd14f3a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6baef84a-7b0b-4a61-9f68-691ccd14f3a6\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.680013 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm5tz\" (UniqueName: \"kubernetes.io/projected/6baef84a-7b0b-4a61-9f68-691ccd14f3a6-kube-api-access-cm5tz\") pod \"nova-scheduler-0\" (UID: \"6baef84a-7b0b-4a61-9f68-691ccd14f3a6\") " pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.696133 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.765209 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hrbd\" (UniqueName: \"kubernetes.io/projected/30d5915f-5d5c-4763-a550-0c1ac74086a2-kube-api-access-9hrbd\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.765260 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/30d5915f-5d5c-4763-a550-0c1ac74086a2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.765281 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30d5915f-5d5c-4763-a550-0c1ac74086a2-logs\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.765328 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d5915f-5d5c-4763-a550-0c1ac74086a2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.765350 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d5915f-5d5c-4763-a550-0c1ac74086a2-config-data\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.766077 4708 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30d5915f-5d5c-4763-a550-0c1ac74086a2-logs\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.768880 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d5915f-5d5c-4763-a550-0c1ac74086a2-config-data\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.784609 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d5915f-5d5c-4763-a550-0c1ac74086a2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.789092 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/30d5915f-5d5c-4763-a550-0c1ac74086a2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.794047 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hrbd\" (UniqueName: \"kubernetes.io/projected/30d5915f-5d5c-4763-a550-0c1ac74086a2-kube-api-access-9hrbd\") pod \"nova-metadata-0\" (UID: \"30d5915f-5d5c-4763-a550-0c1ac74086a2\") " pod="openstack/nova-metadata-0" Mar 20 16:24:41 crc kubenswrapper[4708]: I0320 16:24:41.888216 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.134604 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2659505d-fddd-42b2-a820-21fd1bac6479" path="/var/lib/kubelet/pods/2659505d-fddd-42b2-a820-21fd1bac6479/volumes" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.139195 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6ea07b-75ea-4861-866a-361b91e1c277" path="/var/lib/kubelet/pods/bb6ea07b-75ea-4861-866a-361b91e1c277/volumes" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.179257 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.321858 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6baef84a-7b0b-4a61-9f68-691ccd14f3a6","Type":"ContainerStarted","Data":"b4e1754e1ed18bd07b9170ad41ddb418ea23c993812dd19924b7d035bad240c6"} Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.322978 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.337506 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.351724 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:42 crc kubenswrapper[4708]: W0320 16:24:42.353945 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30d5915f_5d5c_4763_a550_0c1ac74086a2.slice/crio-912559a9344f2ac3291f5e4c4c59d502777b130d64080b1c052f62e63cbb47d0 WatchSource:0}: Error finding container 912559a9344f2ac3291f5e4c4c59d502777b130d64080b1c052f62e63cbb47d0: Status 404 returned error can't find the container with id 912559a9344f2ac3291f5e4c4c59d502777b130d64080b1c052f62e63cbb47d0 Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.359737 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.388591 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.402962 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.407632 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.407714 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.408589 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.419230 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.587076 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.587153 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-config-data\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.587216 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-internal-tls-certs\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.587279 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jg8zl\" (UniqueName: \"kubernetes.io/projected/24d52f23-40eb-4b2f-8c6d-83b887450123-kube-api-access-jg8zl\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.587334 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-public-tls-certs\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.587358 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24d52f23-40eb-4b2f-8c6d-83b887450123-logs\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.688926 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.688988 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-config-data\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.689014 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-internal-tls-certs\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " 
pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.689041 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg8zl\" (UniqueName: \"kubernetes.io/projected/24d52f23-40eb-4b2f-8c6d-83b887450123-kube-api-access-jg8zl\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.689080 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-public-tls-certs\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.689095 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24d52f23-40eb-4b2f-8c6d-83b887450123-logs\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.689479 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24d52f23-40eb-4b2f-8c6d-83b887450123-logs\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.693046 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-config-data\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.693228 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.703918 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-public-tls-certs\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.704149 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24d52f23-40eb-4b2f-8c6d-83b887450123-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.708577 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg8zl\" (UniqueName: \"kubernetes.io/projected/24d52f23-40eb-4b2f-8c6d-83b887450123-kube-api-access-jg8zl\") pod \"nova-api-0\" (UID: \"24d52f23-40eb-4b2f-8c6d-83b887450123\") " pod="openstack/nova-api-0" Mar 20 16:24:42 crc kubenswrapper[4708]: I0320 16:24:42.731882 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 16:24:43 crc kubenswrapper[4708]: I0320 16:24:43.193306 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 16:24:43 crc kubenswrapper[4708]: W0320 16:24:43.205814 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24d52f23_40eb_4b2f_8c6d_83b887450123.slice/crio-7555547ccd65cffe4da500e73fd9d634734a6ad2fc32a265b7bcd03ce487170d WatchSource:0}: Error finding container 7555547ccd65cffe4da500e73fd9d634734a6ad2fc32a265b7bcd03ce487170d: Status 404 returned error can't find the container with id 7555547ccd65cffe4da500e73fd9d634734a6ad2fc32a265b7bcd03ce487170d Mar 20 16:24:43 crc kubenswrapper[4708]: I0320 16:24:43.334207 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30d5915f-5d5c-4763-a550-0c1ac74086a2","Type":"ContainerStarted","Data":"66539939eee6f4e23dff2dca6e39374661031fcad3f8a4a5174331fa754fbe3f"} Mar 20 16:24:43 crc kubenswrapper[4708]: I0320 16:24:43.334705 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30d5915f-5d5c-4763-a550-0c1ac74086a2","Type":"ContainerStarted","Data":"c6dfc400982713dfdf61776aad28f3f9898e1b6075ca0746c2cfde6821368c3a"} Mar 20 16:24:43 crc kubenswrapper[4708]: I0320 16:24:43.334729 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30d5915f-5d5c-4763-a550-0c1ac74086a2","Type":"ContainerStarted","Data":"912559a9344f2ac3291f5e4c4c59d502777b130d64080b1c052f62e63cbb47d0"} Mar 20 16:24:43 crc kubenswrapper[4708]: I0320 16:24:43.337029 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24d52f23-40eb-4b2f-8c6d-83b887450123","Type":"ContainerStarted","Data":"7555547ccd65cffe4da500e73fd9d634734a6ad2fc32a265b7bcd03ce487170d"} Mar 20 16:24:43 crc kubenswrapper[4708]: I0320 
16:24:43.338504 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6baef84a-7b0b-4a61-9f68-691ccd14f3a6","Type":"ContainerStarted","Data":"4c470fae8943491bb3ef4128f61a4622ef6510dce00c2dc535c75b52508a3ee4"} Mar 20 16:24:43 crc kubenswrapper[4708]: I0320 16:24:43.366134 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.366108107 podStartE2EDuration="2.366108107s" podCreationTimestamp="2026-03-20 16:24:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:43.357317299 +0000 UTC m=+1438.031654024" watchObservedRunningTime="2026-03-20 16:24:43.366108107 +0000 UTC m=+1438.040444822" Mar 20 16:24:43 crc kubenswrapper[4708]: I0320 16:24:43.384179 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.384161248 podStartE2EDuration="2.384161248s" podCreationTimestamp="2026-03-20 16:24:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:43.37835828 +0000 UTC m=+1438.052694995" watchObservedRunningTime="2026-03-20 16:24:43.384161248 +0000 UTC m=+1438.058497963" Mar 20 16:24:44 crc kubenswrapper[4708]: I0320 16:24:44.123455 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22397977-15ae-4789-8a19-613b16c79fb6" path="/var/lib/kubelet/pods/22397977-15ae-4789-8a19-613b16c79fb6/volumes" Mar 20 16:24:44 crc kubenswrapper[4708]: I0320 16:24:44.351194 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24d52f23-40eb-4b2f-8c6d-83b887450123","Type":"ContainerStarted","Data":"301618d8249246f73c1df96e943f814399b8d7920f3a20c5c8fd964ed5f3541b"} Mar 20 16:24:44 crc kubenswrapper[4708]: I0320 16:24:44.351249 4708 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24d52f23-40eb-4b2f-8c6d-83b887450123","Type":"ContainerStarted","Data":"c96d34de52b377b1313d4a609e073c286e45cc760973e62fad9c41863f988d43"} Mar 20 16:24:44 crc kubenswrapper[4708]: E0320 16:24:44.371697 4708 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf676e7f_5129_436c_9451_8a9b1c8c19c0.slice/crio-84915d182a57c84c0d30885753a625a1d23becd15174c9ff41a9700696654d79\": RecentStats: unable to find data in memory cache]" Mar 20 16:24:44 crc kubenswrapper[4708]: I0320 16:24:44.373643 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.373560142 podStartE2EDuration="2.373560142s" podCreationTimestamp="2026-03-20 16:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:24:44.370127878 +0000 UTC m=+1439.044464593" watchObservedRunningTime="2026-03-20 16:24:44.373560142 +0000 UTC m=+1439.047896857" Mar 20 16:24:46 crc kubenswrapper[4708]: I0320 16:24:46.696503 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 16:24:51 crc kubenswrapper[4708]: I0320 16:24:51.697256 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 16:24:51 crc kubenswrapper[4708]: I0320 16:24:51.721921 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 16:24:51 crc kubenswrapper[4708]: I0320 16:24:51.888975 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 16:24:51 crc kubenswrapper[4708]: I0320 16:24:51.889030 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 16:24:52 crc kubenswrapper[4708]: I0320 16:24:52.467736 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 16:24:52 crc kubenswrapper[4708]: I0320 16:24:52.733047 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:24:52 crc kubenswrapper[4708]: I0320 16:24:52.733449 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 16:24:52 crc kubenswrapper[4708]: I0320 16:24:52.904984 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="30d5915f-5d5c-4763-a550-0c1ac74086a2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:52 crc kubenswrapper[4708]: I0320 16:24:52.905192 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="30d5915f-5d5c-4763-a550-0c1ac74086a2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:53 crc kubenswrapper[4708]: I0320 16:24:53.749017 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="24d52f23-40eb-4b2f-8c6d-83b887450123" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 16:24:53 crc kubenswrapper[4708]: I0320 16:24:53.749027 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="24d52f23-40eb-4b2f-8c6d-83b887450123" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Mar 20 16:24:59 crc kubenswrapper[4708]: I0320 16:24:59.888861 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 16:24:59 crc kubenswrapper[4708]: I0320 16:24:59.889393 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 16:25:00 crc kubenswrapper[4708]: I0320 16:25:00.589596 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 16:25:00 crc kubenswrapper[4708]: I0320 16:25:00.732413 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 16:25:00 crc kubenswrapper[4708]: I0320 16:25:00.732824 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 16:25:01 crc kubenswrapper[4708]: I0320 16:25:01.898698 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 16:25:01 crc kubenswrapper[4708]: I0320 16:25:01.899198 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 16:25:01 crc kubenswrapper[4708]: I0320 16:25:01.911140 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 16:25:02 crc kubenswrapper[4708]: I0320 16:25:02.549516 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 16:25:02 crc kubenswrapper[4708]: I0320 16:25:02.742309 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 16:25:02 crc kubenswrapper[4708]: I0320 16:25:02.748518 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 16:25:02 crc kubenswrapper[4708]: I0320 16:25:02.748970 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Mar 20 16:25:03 crc kubenswrapper[4708]: I0320 16:25:03.566258 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 16:25:42 crc kubenswrapper[4708]: I0320 16:25:42.868802 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g7pjs/must-gather-6mfnj"] Mar 20 16:25:42 crc kubenswrapper[4708]: I0320 16:25:42.872325 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7pjs/must-gather-6mfnj" Mar 20 16:25:42 crc kubenswrapper[4708]: I0320 16:25:42.877393 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-g7pjs"/"openshift-service-ca.crt" Mar 20 16:25:42 crc kubenswrapper[4708]: I0320 16:25:42.878038 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-g7pjs"/"kube-root-ca.crt" Mar 20 16:25:42 crc kubenswrapper[4708]: I0320 16:25:42.883253 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmhdd\" (UniqueName: \"kubernetes.io/projected/ebe8ded9-92d5-473f-a111-9c4fea2091ba-kube-api-access-hmhdd\") pod \"must-gather-6mfnj\" (UID: \"ebe8ded9-92d5-473f-a111-9c4fea2091ba\") " pod="openshift-must-gather-g7pjs/must-gather-6mfnj" Mar 20 16:25:42 crc kubenswrapper[4708]: I0320 16:25:42.883397 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebe8ded9-92d5-473f-a111-9c4fea2091ba-must-gather-output\") pod \"must-gather-6mfnj\" (UID: \"ebe8ded9-92d5-473f-a111-9c4fea2091ba\") " pod="openshift-must-gather-g7pjs/must-gather-6mfnj" Mar 20 16:25:42 crc kubenswrapper[4708]: I0320 16:25:42.894889 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g7pjs/must-gather-6mfnj"] Mar 20 16:25:42 crc kubenswrapper[4708]: I0320 16:25:42.989523 4708 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebe8ded9-92d5-473f-a111-9c4fea2091ba-must-gather-output\") pod \"must-gather-6mfnj\" (UID: \"ebe8ded9-92d5-473f-a111-9c4fea2091ba\") " pod="openshift-must-gather-g7pjs/must-gather-6mfnj" Mar 20 16:25:42 crc kubenswrapper[4708]: I0320 16:25:42.989987 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmhdd\" (UniqueName: \"kubernetes.io/projected/ebe8ded9-92d5-473f-a111-9c4fea2091ba-kube-api-access-hmhdd\") pod \"must-gather-6mfnj\" (UID: \"ebe8ded9-92d5-473f-a111-9c4fea2091ba\") " pod="openshift-must-gather-g7pjs/must-gather-6mfnj" Mar 20 16:25:42 crc kubenswrapper[4708]: I0320 16:25:42.991015 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebe8ded9-92d5-473f-a111-9c4fea2091ba-must-gather-output\") pod \"must-gather-6mfnj\" (UID: \"ebe8ded9-92d5-473f-a111-9c4fea2091ba\") " pod="openshift-must-gather-g7pjs/must-gather-6mfnj" Mar 20 16:25:43 crc kubenswrapper[4708]: I0320 16:25:43.029316 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmhdd\" (UniqueName: \"kubernetes.io/projected/ebe8ded9-92d5-473f-a111-9c4fea2091ba-kube-api-access-hmhdd\") pod \"must-gather-6mfnj\" (UID: \"ebe8ded9-92d5-473f-a111-9c4fea2091ba\") " pod="openshift-must-gather-g7pjs/must-gather-6mfnj" Mar 20 16:25:43 crc kubenswrapper[4708]: I0320 16:25:43.218949 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7pjs/must-gather-6mfnj" Mar 20 16:25:43 crc kubenswrapper[4708]: I0320 16:25:43.713432 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g7pjs/must-gather-6mfnj"] Mar 20 16:25:43 crc kubenswrapper[4708]: I0320 16:25:43.718489 4708 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:25:43 crc kubenswrapper[4708]: I0320 16:25:43.971949 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7pjs/must-gather-6mfnj" event={"ID":"ebe8ded9-92d5-473f-a111-9c4fea2091ba","Type":"ContainerStarted","Data":"1e3fc457258046daa8c91f5013f3ba0cd329a3eef6ccffefa3cbd9750f281710"} Mar 20 16:25:49 crc kubenswrapper[4708]: I0320 16:25:49.035699 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7pjs/must-gather-6mfnj" event={"ID":"ebe8ded9-92d5-473f-a111-9c4fea2091ba","Type":"ContainerStarted","Data":"dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0"} Mar 20 16:25:49 crc kubenswrapper[4708]: I0320 16:25:49.037170 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7pjs/must-gather-6mfnj" event={"ID":"ebe8ded9-92d5-473f-a111-9c4fea2091ba","Type":"ContainerStarted","Data":"252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542"} Mar 20 16:25:49 crc kubenswrapper[4708]: I0320 16:25:49.098209 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g7pjs/must-gather-6mfnj" podStartSLOduration=2.424946169 podStartE2EDuration="7.098181644s" podCreationTimestamp="2026-03-20 16:25:42 +0000 UTC" firstStartedPulling="2026-03-20 16:25:43.718452637 +0000 UTC m=+1498.392789352" lastFinishedPulling="2026-03-20 16:25:48.391688112 +0000 UTC m=+1503.066024827" observedRunningTime="2026-03-20 16:25:49.091718034 +0000 UTC m=+1503.766054749" watchObservedRunningTime="2026-03-20 16:25:49.098181644 +0000 UTC 
m=+1503.772518359" Mar 20 16:25:52 crc kubenswrapper[4708]: I0320 16:25:52.826871 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g7pjs/crc-debug-z6858"] Mar 20 16:25:52 crc kubenswrapper[4708]: I0320 16:25:52.828334 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7pjs/crc-debug-z6858" Mar 20 16:25:52 crc kubenswrapper[4708]: I0320 16:25:52.829904 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-g7pjs"/"default-dockercfg-6gfrc" Mar 20 16:25:52 crc kubenswrapper[4708]: I0320 16:25:52.877577 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1478909-a439-434e-88b7-82801add9b09-host\") pod \"crc-debug-z6858\" (UID: \"a1478909-a439-434e-88b7-82801add9b09\") " pod="openshift-must-gather-g7pjs/crc-debug-z6858" Mar 20 16:25:52 crc kubenswrapper[4708]: I0320 16:25:52.877737 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr2rq\" (UniqueName: \"kubernetes.io/projected/a1478909-a439-434e-88b7-82801add9b09-kube-api-access-zr2rq\") pod \"crc-debug-z6858\" (UID: \"a1478909-a439-434e-88b7-82801add9b09\") " pod="openshift-must-gather-g7pjs/crc-debug-z6858" Mar 20 16:25:52 crc kubenswrapper[4708]: I0320 16:25:52.979266 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1478909-a439-434e-88b7-82801add9b09-host\") pod \"crc-debug-z6858\" (UID: \"a1478909-a439-434e-88b7-82801add9b09\") " pod="openshift-must-gather-g7pjs/crc-debug-z6858" Mar 20 16:25:52 crc kubenswrapper[4708]: I0320 16:25:52.979371 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr2rq\" (UniqueName: \"kubernetes.io/projected/a1478909-a439-434e-88b7-82801add9b09-kube-api-access-zr2rq\") pod 
\"crc-debug-z6858\" (UID: \"a1478909-a439-434e-88b7-82801add9b09\") " pod="openshift-must-gather-g7pjs/crc-debug-z6858" Mar 20 16:25:52 crc kubenswrapper[4708]: I0320 16:25:52.979780 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1478909-a439-434e-88b7-82801add9b09-host\") pod \"crc-debug-z6858\" (UID: \"a1478909-a439-434e-88b7-82801add9b09\") " pod="openshift-must-gather-g7pjs/crc-debug-z6858" Mar 20 16:25:53 crc kubenswrapper[4708]: I0320 16:25:53.004826 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr2rq\" (UniqueName: \"kubernetes.io/projected/a1478909-a439-434e-88b7-82801add9b09-kube-api-access-zr2rq\") pod \"crc-debug-z6858\" (UID: \"a1478909-a439-434e-88b7-82801add9b09\") " pod="openshift-must-gather-g7pjs/crc-debug-z6858" Mar 20 16:25:53 crc kubenswrapper[4708]: I0320 16:25:53.157687 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7pjs/crc-debug-z6858" Mar 20 16:25:53 crc kubenswrapper[4708]: W0320 16:25:53.190875 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1478909_a439_434e_88b7_82801add9b09.slice/crio-0bc81110e5e4ae49282f3b11cc5e802d5234681bbfc15ae7fab7170a28b7bc6b WatchSource:0}: Error finding container 0bc81110e5e4ae49282f3b11cc5e802d5234681bbfc15ae7fab7170a28b7bc6b: Status 404 returned error can't find the container with id 0bc81110e5e4ae49282f3b11cc5e802d5234681bbfc15ae7fab7170a28b7bc6b Mar 20 16:25:54 crc kubenswrapper[4708]: I0320 16:25:54.091361 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7pjs/crc-debug-z6858" event={"ID":"a1478909-a439-434e-88b7-82801add9b09","Type":"ContainerStarted","Data":"0bc81110e5e4ae49282f3b11cc5e802d5234681bbfc15ae7fab7170a28b7bc6b"} Mar 20 16:26:00 crc kubenswrapper[4708]: I0320 16:26:00.151971 4708 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567066-thfp9"] Mar 20 16:26:00 crc kubenswrapper[4708]: I0320 16:26:00.154049 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-thfp9" Mar 20 16:26:00 crc kubenswrapper[4708]: I0320 16:26:00.156828 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:26:00 crc kubenswrapper[4708]: I0320 16:26:00.157085 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5" Mar 20 16:26:00 crc kubenswrapper[4708]: I0320 16:26:00.157236 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:26:00 crc kubenswrapper[4708]: I0320 16:26:00.169353 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-thfp9"] Mar 20 16:26:00 crc kubenswrapper[4708]: I0320 16:26:00.222660 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6424f\" (UniqueName: \"kubernetes.io/projected/3749e625-ee58-48e8-858d-2b95c1e33b05-kube-api-access-6424f\") pod \"auto-csr-approver-29567066-thfp9\" (UID: \"3749e625-ee58-48e8-858d-2b95c1e33b05\") " pod="openshift-infra/auto-csr-approver-29567066-thfp9" Mar 20 16:26:00 crc kubenswrapper[4708]: I0320 16:26:00.324770 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6424f\" (UniqueName: \"kubernetes.io/projected/3749e625-ee58-48e8-858d-2b95c1e33b05-kube-api-access-6424f\") pod \"auto-csr-approver-29567066-thfp9\" (UID: \"3749e625-ee58-48e8-858d-2b95c1e33b05\") " pod="openshift-infra/auto-csr-approver-29567066-thfp9" Mar 20 16:26:00 crc kubenswrapper[4708]: I0320 16:26:00.345113 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6424f\" 
(UniqueName: \"kubernetes.io/projected/3749e625-ee58-48e8-858d-2b95c1e33b05-kube-api-access-6424f\") pod \"auto-csr-approver-29567066-thfp9\" (UID: \"3749e625-ee58-48e8-858d-2b95c1e33b05\") " pod="openshift-infra/auto-csr-approver-29567066-thfp9" Mar 20 16:26:00 crc kubenswrapper[4708]: I0320 16:26:00.488858 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-thfp9" Mar 20 16:26:07 crc kubenswrapper[4708]: I0320 16:26:07.426342 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-thfp9"] Mar 20 16:26:08 crc kubenswrapper[4708]: I0320 16:26:08.253374 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-thfp9" event={"ID":"3749e625-ee58-48e8-858d-2b95c1e33b05","Type":"ContainerStarted","Data":"42f2e7e19b50051adfa468653c946c53e222171c4dbd29fb256ed3baec941a90"} Mar 20 16:26:08 crc kubenswrapper[4708]: I0320 16:26:08.255893 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7pjs/crc-debug-z6858" event={"ID":"a1478909-a439-434e-88b7-82801add9b09","Type":"ContainerStarted","Data":"2aca58a32a92db81355b47c8ed675e75e260e375a20e54a8ebb618368e73cf81"} Mar 20 16:26:08 crc kubenswrapper[4708]: I0320 16:26:08.275475 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g7pjs/crc-debug-z6858" podStartSLOduration=2.470629849 podStartE2EDuration="16.275445622s" podCreationTimestamp="2026-03-20 16:25:52 +0000 UTC" firstStartedPulling="2026-03-20 16:25:53.193187822 +0000 UTC m=+1507.867524547" lastFinishedPulling="2026-03-20 16:26:06.998003605 +0000 UTC m=+1521.672340320" observedRunningTime="2026-03-20 16:26:08.266983818 +0000 UTC m=+1522.941320553" watchObservedRunningTime="2026-03-20 16:26:08.275445622 +0000 UTC m=+1522.949782337" Mar 20 16:26:09 crc kubenswrapper[4708]: I0320 16:26:09.268468 4708 generic.go:334] "Generic (PLEG): 
container finished" podID="3749e625-ee58-48e8-858d-2b95c1e33b05" containerID="0214912007e228e729cb351ec91fad5c2f6ba03158fcc635b3bae094ab2c476a" exitCode=0 Mar 20 16:26:09 crc kubenswrapper[4708]: I0320 16:26:09.268530 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-thfp9" event={"ID":"3749e625-ee58-48e8-858d-2b95c1e33b05","Type":"ContainerDied","Data":"0214912007e228e729cb351ec91fad5c2f6ba03158fcc635b3bae094ab2c476a"} Mar 20 16:26:10 crc kubenswrapper[4708]: I0320 16:26:10.629801 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-thfp9" Mar 20 16:26:10 crc kubenswrapper[4708]: I0320 16:26:10.747528 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6424f\" (UniqueName: \"kubernetes.io/projected/3749e625-ee58-48e8-858d-2b95c1e33b05-kube-api-access-6424f\") pod \"3749e625-ee58-48e8-858d-2b95c1e33b05\" (UID: \"3749e625-ee58-48e8-858d-2b95c1e33b05\") " Mar 20 16:26:10 crc kubenswrapper[4708]: I0320 16:26:10.758129 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3749e625-ee58-48e8-858d-2b95c1e33b05-kube-api-access-6424f" (OuterVolumeSpecName: "kube-api-access-6424f") pod "3749e625-ee58-48e8-858d-2b95c1e33b05" (UID: "3749e625-ee58-48e8-858d-2b95c1e33b05"). InnerVolumeSpecName "kube-api-access-6424f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:26:10 crc kubenswrapper[4708]: I0320 16:26:10.850496 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6424f\" (UniqueName: \"kubernetes.io/projected/3749e625-ee58-48e8-858d-2b95c1e33b05-kube-api-access-6424f\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:11 crc kubenswrapper[4708]: I0320 16:26:11.293357 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-thfp9" event={"ID":"3749e625-ee58-48e8-858d-2b95c1e33b05","Type":"ContainerDied","Data":"42f2e7e19b50051adfa468653c946c53e222171c4dbd29fb256ed3baec941a90"} Mar 20 16:26:11 crc kubenswrapper[4708]: I0320 16:26:11.293683 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42f2e7e19b50051adfa468653c946c53e222171c4dbd29fb256ed3baec941a90" Mar 20 16:26:11 crc kubenswrapper[4708]: I0320 16:26:11.293429 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-thfp9" Mar 20 16:26:11 crc kubenswrapper[4708]: I0320 16:26:11.716278 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-smkpv"] Mar 20 16:26:11 crc kubenswrapper[4708]: I0320 16:26:11.738371 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-smkpv"] Mar 20 16:26:12 crc kubenswrapper[4708]: I0320 16:26:12.124211 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e68e520-8128-47bf-9e19-20ee31ecbcad" path="/var/lib/kubelet/pods/6e68e520-8128-47bf-9e19-20ee31ecbcad/volumes" Mar 20 16:26:17 crc kubenswrapper[4708]: I0320 16:26:17.071293 4708 scope.go:117] "RemoveContainer" containerID="3bce7486a1dc6a240390efd4a52deb80f3738f57f67d941bea0386d021a24955" Mar 20 16:26:26 crc kubenswrapper[4708]: I0320 16:26:26.178628 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:26:26 crc kubenswrapper[4708]: I0320 16:26:26.179185 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:26:38 crc kubenswrapper[4708]: I0320 16:26:38.750708 4708 generic.go:334] "Generic (PLEG): container finished" podID="a1478909-a439-434e-88b7-82801add9b09" containerID="2aca58a32a92db81355b47c8ed675e75e260e375a20e54a8ebb618368e73cf81" exitCode=0 Mar 20 16:26:38 crc kubenswrapper[4708]: I0320 16:26:38.750799 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7pjs/crc-debug-z6858" event={"ID":"a1478909-a439-434e-88b7-82801add9b09","Type":"ContainerDied","Data":"2aca58a32a92db81355b47c8ed675e75e260e375a20e54a8ebb618368e73cf81"} Mar 20 16:26:39 crc kubenswrapper[4708]: I0320 16:26:39.879422 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7pjs/crc-debug-z6858" Mar 20 16:26:39 crc kubenswrapper[4708]: I0320 16:26:39.916144 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g7pjs/crc-debug-z6858"] Mar 20 16:26:39 crc kubenswrapper[4708]: I0320 16:26:39.927866 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g7pjs/crc-debug-z6858"] Mar 20 16:26:39 crc kubenswrapper[4708]: I0320 16:26:39.982499 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr2rq\" (UniqueName: \"kubernetes.io/projected/a1478909-a439-434e-88b7-82801add9b09-kube-api-access-zr2rq\") pod \"a1478909-a439-434e-88b7-82801add9b09\" (UID: \"a1478909-a439-434e-88b7-82801add9b09\") " Mar 20 16:26:39 crc kubenswrapper[4708]: I0320 16:26:39.983003 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1478909-a439-434e-88b7-82801add9b09-host\") pod \"a1478909-a439-434e-88b7-82801add9b09\" (UID: \"a1478909-a439-434e-88b7-82801add9b09\") " Mar 20 16:26:39 crc kubenswrapper[4708]: I0320 16:26:39.983081 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1478909-a439-434e-88b7-82801add9b09-host" (OuterVolumeSpecName: "host") pod "a1478909-a439-434e-88b7-82801add9b09" (UID: "a1478909-a439-434e-88b7-82801add9b09"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:26:39 crc kubenswrapper[4708]: I0320 16:26:39.983804 4708 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1478909-a439-434e-88b7-82801add9b09-host\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:39 crc kubenswrapper[4708]: I0320 16:26:39.995601 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1478909-a439-434e-88b7-82801add9b09-kube-api-access-zr2rq" (OuterVolumeSpecName: "kube-api-access-zr2rq") pod "a1478909-a439-434e-88b7-82801add9b09" (UID: "a1478909-a439-434e-88b7-82801add9b09"). InnerVolumeSpecName "kube-api-access-zr2rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:26:40 crc kubenswrapper[4708]: I0320 16:26:40.085522 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr2rq\" (UniqueName: \"kubernetes.io/projected/a1478909-a439-434e-88b7-82801add9b09-kube-api-access-zr2rq\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:40 crc kubenswrapper[4708]: I0320 16:26:40.122397 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1478909-a439-434e-88b7-82801add9b09" path="/var/lib/kubelet/pods/a1478909-a439-434e-88b7-82801add9b09/volumes" Mar 20 16:26:40 crc kubenswrapper[4708]: I0320 16:26:40.771412 4708 scope.go:117] "RemoveContainer" containerID="2aca58a32a92db81355b47c8ed675e75e260e375a20e54a8ebb618368e73cf81" Mar 20 16:26:40 crc kubenswrapper[4708]: I0320 16:26:40.771505 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7pjs/crc-debug-z6858" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.119289 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g7pjs/crc-debug-n22dt"] Mar 20 16:26:41 crc kubenswrapper[4708]: E0320 16:26:41.120108 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3749e625-ee58-48e8-858d-2b95c1e33b05" containerName="oc" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.120124 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="3749e625-ee58-48e8-858d-2b95c1e33b05" containerName="oc" Mar 20 16:26:41 crc kubenswrapper[4708]: E0320 16:26:41.120151 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1478909-a439-434e-88b7-82801add9b09" containerName="container-00" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.120160 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1478909-a439-434e-88b7-82801add9b09" containerName="container-00" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.120407 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1478909-a439-434e-88b7-82801add9b09" containerName="container-00" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.120428 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="3749e625-ee58-48e8-858d-2b95c1e33b05" containerName="oc" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.121296 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7pjs/crc-debug-n22dt" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.124228 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-g7pjs"/"default-dockercfg-6gfrc" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.207903 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5frgt\" (UniqueName: \"kubernetes.io/projected/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-kube-api-access-5frgt\") pod \"crc-debug-n22dt\" (UID: \"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef\") " pod="openshift-must-gather-g7pjs/crc-debug-n22dt" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.208051 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-host\") pod \"crc-debug-n22dt\" (UID: \"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef\") " pod="openshift-must-gather-g7pjs/crc-debug-n22dt" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.310954 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-host\") pod \"crc-debug-n22dt\" (UID: \"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef\") " pod="openshift-must-gather-g7pjs/crc-debug-n22dt" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.311071 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5frgt\" (UniqueName: \"kubernetes.io/projected/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-kube-api-access-5frgt\") pod \"crc-debug-n22dt\" (UID: \"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef\") " pod="openshift-must-gather-g7pjs/crc-debug-n22dt" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.311090 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-host\") pod \"crc-debug-n22dt\" (UID: \"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef\") " pod="openshift-must-gather-g7pjs/crc-debug-n22dt" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.345597 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5frgt\" (UniqueName: \"kubernetes.io/projected/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-kube-api-access-5frgt\") pod \"crc-debug-n22dt\" (UID: \"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef\") " pod="openshift-must-gather-g7pjs/crc-debug-n22dt" Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.452177 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7pjs/crc-debug-n22dt" Mar 20 16:26:41 crc kubenswrapper[4708]: W0320 16:26:41.506270 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod076cdbbe_d5f3_41f7_8591_2b0f4c4a3fef.slice/crio-8065b74baa737953d13444f486839554fa201ae46183da4113ca81d92f20b34c WatchSource:0}: Error finding container 8065b74baa737953d13444f486839554fa201ae46183da4113ca81d92f20b34c: Status 404 returned error can't find the container with id 8065b74baa737953d13444f486839554fa201ae46183da4113ca81d92f20b34c Mar 20 16:26:41 crc kubenswrapper[4708]: I0320 16:26:41.780763 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7pjs/crc-debug-n22dt" event={"ID":"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef","Type":"ContainerStarted","Data":"8065b74baa737953d13444f486839554fa201ae46183da4113ca81d92f20b34c"} Mar 20 16:26:42 crc kubenswrapper[4708]: I0320 16:26:42.791887 4708 generic.go:334] "Generic (PLEG): container finished" podID="076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef" containerID="54b846849bcbfc37005959bd5200ba5a0f28647c4ad9e6c9f7cd3cb9ab86f120" exitCode=1 Mar 20 16:26:42 crc kubenswrapper[4708]: I0320 16:26:42.791936 4708 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-must-gather-g7pjs/crc-debug-n22dt" event={"ID":"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef","Type":"ContainerDied","Data":"54b846849bcbfc37005959bd5200ba5a0f28647c4ad9e6c9f7cd3cb9ab86f120"} Mar 20 16:26:42 crc kubenswrapper[4708]: I0320 16:26:42.849114 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g7pjs/crc-debug-n22dt"] Mar 20 16:26:42 crc kubenswrapper[4708]: I0320 16:26:42.862055 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g7pjs/crc-debug-n22dt"] Mar 20 16:26:43 crc kubenswrapper[4708]: I0320 16:26:43.929219 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7pjs/crc-debug-n22dt" Mar 20 16:26:44 crc kubenswrapper[4708]: I0320 16:26:44.074421 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-host" (OuterVolumeSpecName: "host") pod "076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef" (UID: "076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 16:26:44 crc kubenswrapper[4708]: I0320 16:26:44.076273 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-host\") pod \"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef\" (UID: \"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef\") " Mar 20 16:26:44 crc kubenswrapper[4708]: I0320 16:26:44.076514 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5frgt\" (UniqueName: \"kubernetes.io/projected/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-kube-api-access-5frgt\") pod \"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef\" (UID: \"076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef\") " Mar 20 16:26:44 crc kubenswrapper[4708]: I0320 16:26:44.077960 4708 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-host\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:44 crc kubenswrapper[4708]: I0320 16:26:44.091011 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-kube-api-access-5frgt" (OuterVolumeSpecName: "kube-api-access-5frgt") pod "076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef" (UID: "076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef"). InnerVolumeSpecName "kube-api-access-5frgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:26:44 crc kubenswrapper[4708]: I0320 16:26:44.133255 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef" path="/var/lib/kubelet/pods/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef/volumes" Mar 20 16:26:44 crc kubenswrapper[4708]: I0320 16:26:44.182337 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5frgt\" (UniqueName: \"kubernetes.io/projected/076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef-kube-api-access-5frgt\") on node \"crc\" DevicePath \"\"" Mar 20 16:26:44 crc kubenswrapper[4708]: I0320 16:26:44.811905 4708 scope.go:117] "RemoveContainer" containerID="54b846849bcbfc37005959bd5200ba5a0f28647c4ad9e6c9f7cd3cb9ab86f120" Mar 20 16:26:44 crc kubenswrapper[4708]: I0320 16:26:44.811947 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7pjs/crc-debug-n22dt" Mar 20 16:26:56 crc kubenswrapper[4708]: I0320 16:26:56.178787 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:26:56 crc kubenswrapper[4708]: I0320 16:26:56.179250 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:27:05 crc kubenswrapper[4708]: I0320 16:27:05.216632 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-5805-account-create-update-2qn7m_4e28ace2-9e11-4223-b58d-91688cd2ced4/mariadb-account-create-update/0.log" Mar 20 16:27:05 crc 
kubenswrapper[4708]: I0320 16:27:05.404616 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c9d458d9b-7t7xq_e3c707e2-14a0-493e-88f8-81760da73840/barbican-api/0.log" Mar 20 16:27:05 crc kubenswrapper[4708]: I0320 16:27:05.459962 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c9d458d9b-7t7xq_e3c707e2-14a0-493e-88f8-81760da73840/barbican-api-log/0.log" Mar 20 16:27:05 crc kubenswrapper[4708]: I0320 16:27:05.600509 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-create-nbvxk_3a9a8c73-828e-42ef-9818-6aab510e8240/mariadb-database-create/0.log" Mar 20 16:27:05 crc kubenswrapper[4708]: I0320 16:27:05.658931 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-sync-zk9cx_204070bf-f103-49d9-b366-185454e68b9e/barbican-db-sync/0.log" Mar 20 16:27:05 crc kubenswrapper[4708]: I0320 16:27:05.759910 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f5dc8d9b6-dv4pg_8868c62f-1325-4541-96b6-57a48f5b045e/barbican-keystone-listener/0.log" Mar 20 16:27:05 crc kubenswrapper[4708]: I0320 16:27:05.937420 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f5dc8d9b6-dv4pg_8868c62f-1325-4541-96b6-57a48f5b045e/barbican-keystone-listener-log/0.log" Mar 20 16:27:05 crc kubenswrapper[4708]: I0320 16:27:05.993689 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7cf76bb97c-b4rrf_f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9/barbican-worker/0.log" Mar 20 16:27:06 crc kubenswrapper[4708]: I0320 16:27:06.034770 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7cf76bb97c-b4rrf_f1b82e86-4f8a-4d9c-8b0c-97b8b34cf8f9/barbican-worker-log/0.log" Mar 20 16:27:06 crc kubenswrapper[4708]: I0320 16:27:06.210393 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_2f87093e-f826-4a63-b47f-60c6fba18500/ceilometer-notification-agent/0.log" Mar 20 16:27:06 crc kubenswrapper[4708]: I0320 16:27:06.231196 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2f87093e-f826-4a63-b47f-60c6fba18500/ceilometer-central-agent/0.log" Mar 20 16:27:06 crc kubenswrapper[4708]: I0320 16:27:06.258459 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2f87093e-f826-4a63-b47f-60c6fba18500/proxy-httpd/0.log" Mar 20 16:27:06 crc kubenswrapper[4708]: I0320 16:27:06.366046 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2f87093e-f826-4a63-b47f-60c6fba18500/sg-core/0.log" Mar 20 16:27:06 crc kubenswrapper[4708]: I0320 16:27:06.391896 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-42e7-account-create-update-jmvzr_949acc8b-4603-4567-9864-0328462133a0/mariadb-account-create-update/0.log" Mar 20 16:27:06 crc kubenswrapper[4708]: I0320 16:27:06.796417 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b5c84402-2274-4aa9-a456-4b936ba6b94e/cinder-api-log/0.log" Mar 20 16:27:06 crc kubenswrapper[4708]: I0320 16:27:06.842767 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b5c84402-2274-4aa9-a456-4b936ba6b94e/cinder-api/0.log" Mar 20 16:27:06 crc kubenswrapper[4708]: I0320 16:27:06.934577 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-create-qnn97_15badc78-5ab8-41aa-acfb-4bb1f28bcbab/mariadb-database-create/0.log" Mar 20 16:27:07 crc kubenswrapper[4708]: I0320 16:27:07.066119 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-sync-fgjlj_c46a759f-98ab-495d-9cab-ba1f2fbbb112/cinder-db-sync/0.log" Mar 20 16:27:07 crc kubenswrapper[4708]: I0320 16:27:07.158040 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_c5c890eb-8583-4d2f-bf71-55e9f83b51d6/cinder-scheduler/0.log" Mar 20 16:27:07 crc kubenswrapper[4708]: I0320 16:27:07.232823 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c5c890eb-8583-4d2f-bf71-55e9f83b51d6/probe/0.log" Mar 20 16:27:07 crc kubenswrapper[4708]: I0320 16:27:07.329508 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-mkqbz_263a49a4-a2a6-4d75-82b5-cb508abfe752/init/0.log" Mar 20 16:27:07 crc kubenswrapper[4708]: I0320 16:27:07.556199 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-mkqbz_263a49a4-a2a6-4d75-82b5-cb508abfe752/init/0.log" Mar 20 16:27:07 crc kubenswrapper[4708]: I0320 16:27:07.569632 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-mkqbz_263a49a4-a2a6-4d75-82b5-cb508abfe752/dnsmasq-dns/0.log" Mar 20 16:27:07 crc kubenswrapper[4708]: I0320 16:27:07.585384 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-3dc6-account-create-update-cqsnp_5609997e-9b3b-4472-a91a-0948eacb77f1/mariadb-account-create-update/0.log" Mar 20 16:27:07 crc kubenswrapper[4708]: I0320 16:27:07.796926 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-create-dw2jc_720c033b-2069-47be-b543-00c6005b496b/mariadb-database-create/0.log" Mar 20 16:27:07 crc kubenswrapper[4708]: I0320 16:27:07.864226 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-sync-rpnp8_e73e6a53-ccd4-45bf-ad96-a6de1e696888/glance-db-sync/0.log" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.001963 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2842ffef-b14c-48e2-8cf5-cb9aee2d1131/glance-httpd/0.log" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.061227 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_2842ffef-b14c-48e2-8cf5-cb9aee2d1131/glance-log/0.log" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.232362 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c01a244f-263d-435a-90d8-35bdb111bb6a/glance-httpd/0.log" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.281081 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c01a244f-263d-435a-90d8-35bdb111bb6a/glance-log/0.log" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.431990 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7bc9dd67b8-mz4lv_15901de5-ddbe-4c7b-8968-8c614619be4d/horizon/0.log" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.477663 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7bc9dd67b8-mz4lv_15901de5-ddbe-4c7b-8968-8c614619be4d/horizon-log/0.log" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.627758 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7bd9698484-kk2kq_c6ba411f-6368-44dd-a104-84c141bd9092/keystone-api/0.log" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.716912 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-930e-account-create-update-nzvsx_799c71bd-9b76-4d2d-a4b9-6953e7ee2863/mariadb-account-create-update/0.log" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.820868 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-69wwd"] Mar 20 16:27:08 crc kubenswrapper[4708]: E0320 16:27:08.821327 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef" containerName="container-00" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.821344 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef" 
containerName="container-00" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.821534 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="076cdbbe-d5f3-41f7-8591-2b0f4c4a3fef" containerName="container-00" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.822924 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.832442 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69wwd"] Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.867861 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-utilities\") pod \"redhat-marketplace-69wwd\" (UID: \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.868089 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh7tk\" (UniqueName: \"kubernetes.io/projected/fc2c46d0-f762-4645-bbdb-093d7622c5fe-kube-api-access-jh7tk\") pod \"redhat-marketplace-69wwd\" (UID: \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.868151 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-catalog-content\") pod \"redhat-marketplace-69wwd\" (UID: \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.900203 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-bootstrap-q5htg_341b59c3-684f-45e4-9d42-ed258e0e671b/keystone-bootstrap/0.log" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.970519 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh7tk\" (UniqueName: \"kubernetes.io/projected/fc2c46d0-f762-4645-bbdb-093d7622c5fe-kube-api-access-jh7tk\") pod \"redhat-marketplace-69wwd\" (UID: \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.970612 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-catalog-content\") pod \"redhat-marketplace-69wwd\" (UID: \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.970667 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-utilities\") pod \"redhat-marketplace-69wwd\" (UID: \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.971355 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-utilities\") pod \"redhat-marketplace-69wwd\" (UID: \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:08 crc kubenswrapper[4708]: I0320 16:27:08.971721 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-catalog-content\") pod \"redhat-marketplace-69wwd\" (UID: 
\"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:09 crc kubenswrapper[4708]: I0320 16:27:09.045655 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh7tk\" (UniqueName: \"kubernetes.io/projected/fc2c46d0-f762-4645-bbdb-093d7622c5fe-kube-api-access-jh7tk\") pod \"redhat-marketplace-69wwd\" (UID: \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:09 crc kubenswrapper[4708]: I0320 16:27:09.099771 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-create-p756b_50a31e27-283d-43b0-91b5-71c548d61e27/mariadb-database-create/0.log" Mar 20 16:27:09 crc kubenswrapper[4708]: I0320 16:27:09.159744 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:09 crc kubenswrapper[4708]: I0320 16:27:09.304302 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-z2bxx_8d5e7b3a-c1c7-493a-a587-19d751f038be/keystone-db-sync/0.log" Mar 20 16:27:09 crc kubenswrapper[4708]: I0320 16:27:09.445646 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1c5ac3f4-8b13-4d9b-98f5-a5c42ac4dbab/kube-state-metrics/0.log" Mar 20 16:27:09 crc kubenswrapper[4708]: I0320 16:27:09.708546 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69wwd"] Mar 20 16:27:09 crc kubenswrapper[4708]: I0320 16:27:09.855811 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5bed-account-create-update-8dhzm_247ec5aa-1401-422a-b9f7-71c8c9b4876e/mariadb-account-create-update/0.log" Mar 20 16:27:09 crc kubenswrapper[4708]: I0320 16:27:09.985053 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-9f694fd9c-lggtm_90f720ed-ded0-464e-9691-8f83c10700b0/neutron-api/0.log" Mar 20 16:27:10 crc kubenswrapper[4708]: I0320 16:27:10.058142 4708 generic.go:334] "Generic (PLEG): container finished" podID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" containerID="605999be8876beff295beb8b07d5830897ca334eafa339629e711a00f56cdaee" exitCode=0 Mar 20 16:27:10 crc kubenswrapper[4708]: I0320 16:27:10.058190 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69wwd" event={"ID":"fc2c46d0-f762-4645-bbdb-093d7622c5fe","Type":"ContainerDied","Data":"605999be8876beff295beb8b07d5830897ca334eafa339629e711a00f56cdaee"} Mar 20 16:27:10 crc kubenswrapper[4708]: I0320 16:27:10.058237 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69wwd" event={"ID":"fc2c46d0-f762-4645-bbdb-093d7622c5fe","Type":"ContainerStarted","Data":"ae2e55c7b1a9a7967004a1385ca53cd1fb24be1984e4238704b7545f09a832b4"} Mar 20 16:27:10 crc kubenswrapper[4708]: I0320 16:27:10.093336 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9f694fd9c-lggtm_90f720ed-ded0-464e-9691-8f83c10700b0/neutron-httpd/0.log" Mar 20 16:27:10 crc kubenswrapper[4708]: I0320 16:27:10.189096 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-create-dg24d_1b15f38d-ded0-4fed-add0-c891d2208014/mariadb-database-create/0.log" Mar 20 16:27:10 crc kubenswrapper[4708]: I0320 16:27:10.376383 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-sync-sr6vd_52e4d34b-0c95-475c-b9e5-be1dff27d5a3/neutron-db-sync/0.log" Mar 20 16:27:10 crc kubenswrapper[4708]: I0320 16:27:10.548404 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_24d52f23-40eb-4b2f-8c6d-83b887450123/nova-api-api/0.log" Mar 20 16:27:10 crc kubenswrapper[4708]: I0320 16:27:10.569820 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_24d52f23-40eb-4b2f-8c6d-83b887450123/nova-api-log/0.log" Mar 20 16:27:10 crc kubenswrapper[4708]: I0320 16:27:10.706712 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0234-account-create-update-krdbf_394a21a5-81ce-4b43-8642-70a03a4a0685/mariadb-account-create-update/0.log" Mar 20 16:27:10 crc kubenswrapper[4708]: I0320 16:27:10.751244 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-db-create-qc5ff_22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc/mariadb-database-create/0.log" Mar 20 16:27:11 crc kubenswrapper[4708]: I0320 16:27:11.016878 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-cell-mapping-2hpvr_af676e7f-5129-436c-9451-8a9b1c8c19c0/nova-manage/0.log" Mar 20 16:27:11 crc kubenswrapper[4708]: I0320 16:27:11.072104 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69wwd" event={"ID":"fc2c46d0-f762-4645-bbdb-093d7622c5fe","Type":"ContainerStarted","Data":"a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1"} Mar 20 16:27:11 crc kubenswrapper[4708]: I0320 16:27:11.238201 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-db-sync-5dkqq_d679da4d-509e-49d9-a465-405bda8b3e2d/nova-cell0-conductor-db-sync/0.log" Mar 20 16:27:11 crc kubenswrapper[4708]: I0320 16:27:11.283154 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_281ab006-f27d-4cb5-9d26-8fde6cc40ab2/nova-cell0-conductor-conductor/0.log" Mar 20 16:27:11 crc kubenswrapper[4708]: I0320 16:27:11.413870 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-create-2vzqh_3f246f8c-2e08-400c-af52-746be688f708/mariadb-database-create/0.log" Mar 20 16:27:11 crc kubenswrapper[4708]: I0320 16:27:11.540915 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-f9fb-account-create-update-ngw6v_3116dd2a-d2d0-46cf-837d-56d29a7e116f/mariadb-account-create-update/0.log" Mar 20 16:27:11 crc kubenswrapper[4708]: I0320 16:27:11.686686 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-1d98-account-create-update-vvbxv_b0b66f79-c7ee-40c3-a026-0c42a0648f11/mariadb-account-create-update/0.log" Mar 20 16:27:11 crc kubenswrapper[4708]: I0320 16:27:11.772438 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-cell-mapping-nktb8_243ed677-1e35-40c7-ba55-a90eb9c7b85c/nova-manage/0.log" Mar 20 16:27:12 crc kubenswrapper[4708]: I0320 16:27:12.068381 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_dd84187f-6a03-4149-89b8-bc697c1ad82c/nova-cell1-conductor-conductor/0.log" Mar 20 16:27:12 crc kubenswrapper[4708]: I0320 16:27:12.069512 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-db-sync-zpnrd_384f206f-1142-4027-90ef-7adfeda8a5f5/nova-cell1-conductor-db-sync/0.log" Mar 20 16:27:12 crc kubenswrapper[4708]: I0320 16:27:12.083045 4708 generic.go:334] "Generic (PLEG): container finished" podID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" containerID="a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1" exitCode=0 Mar 20 16:27:12 crc kubenswrapper[4708]: I0320 16:27:12.083098 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69wwd" event={"ID":"fc2c46d0-f762-4645-bbdb-093d7622c5fe","Type":"ContainerDied","Data":"a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1"} Mar 20 16:27:12 crc kubenswrapper[4708]: I0320 16:27:12.248877 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-create-wwl4c_630fc775-bda7-45ac-9852-650855479072/mariadb-database-create/0.log" Mar 20 16:27:12 crc kubenswrapper[4708]: I0320 16:27:12.322946 4708 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0e151c63-d78f-4799-9489-09c4d91cb4ab/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 16:27:12 crc kubenswrapper[4708]: I0320 16:27:12.601113 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_30d5915f-5d5c-4763-a550-0c1ac74086a2/nova-metadata-log/0.log" Mar 20 16:27:12 crc kubenswrapper[4708]: I0320 16:27:12.751143 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_30d5915f-5d5c-4763-a550-0c1ac74086a2/nova-metadata-metadata/0.log" Mar 20 16:27:12 crc kubenswrapper[4708]: I0320 16:27:12.888297 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_98e85ea9-8f00-458b-9016-ef5c4b9569f7/mysql-bootstrap/0.log" Mar 20 16:27:12 crc kubenswrapper[4708]: I0320 16:27:12.888775 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6baef84a-7b0b-4a61-9f68-691ccd14f3a6/nova-scheduler-scheduler/0.log" Mar 20 16:27:13 crc kubenswrapper[4708]: I0320 16:27:13.094085 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69wwd" event={"ID":"fc2c46d0-f762-4645-bbdb-093d7622c5fe","Type":"ContainerStarted","Data":"20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b"} Mar 20 16:27:13 crc kubenswrapper[4708]: I0320 16:27:13.111325 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-69wwd" podStartSLOduration=2.667787202 podStartE2EDuration="5.111304981s" podCreationTimestamp="2026-03-20 16:27:08 +0000 UTC" firstStartedPulling="2026-03-20 16:27:10.06161438 +0000 UTC m=+1584.735951095" lastFinishedPulling="2026-03-20 16:27:12.505132159 +0000 UTC m=+1587.179468874" observedRunningTime="2026-03-20 16:27:13.109641685 +0000 UTC m=+1587.783978420" watchObservedRunningTime="2026-03-20 16:27:13.111304981 +0000 UTC m=+1587.785641706" Mar 
20 16:27:13 crc kubenswrapper[4708]: I0320 16:27:13.303383 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_98e85ea9-8f00-458b-9016-ef5c4b9569f7/mysql-bootstrap/0.log" Mar 20 16:27:13 crc kubenswrapper[4708]: I0320 16:27:13.343816 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_052c07b4-fc8c-45df-9294-d6217de2f52c/mysql-bootstrap/0.log" Mar 20 16:27:13 crc kubenswrapper[4708]: I0320 16:27:13.362966 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_98e85ea9-8f00-458b-9016-ef5c4b9569f7/galera/0.log" Mar 20 16:27:13 crc kubenswrapper[4708]: I0320 16:27:13.687533 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_08cb3912-b1b4-40bb-a815-7c7ca540f327/openstackclient/0.log" Mar 20 16:27:13 crc kubenswrapper[4708]: I0320 16:27:13.690934 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_052c07b4-fc8c-45df-9294-d6217de2f52c/mysql-bootstrap/0.log" Mar 20 16:27:13 crc kubenswrapper[4708]: I0320 16:27:13.748774 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_052c07b4-fc8c-45df-9294-d6217de2f52c/galera/0.log" Mar 20 16:27:13 crc kubenswrapper[4708]: I0320 16:27:13.931836 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-22qr2_8ad8d5cb-c681-406d-8dee-25f0a0f71b83/ovn-controller/0.log" Mar 20 16:27:14 crc kubenswrapper[4708]: I0320 16:27:14.147711 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-42bhb_459658ac-d380-46ac-9cec-377e902eba9c/openstack-network-exporter/0.log" Mar 20 16:27:14 crc kubenswrapper[4708]: I0320 16:27:14.186072 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-w47cd_c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0/ovsdb-server-init/0.log" Mar 20 16:27:14 crc 
kubenswrapper[4708]: I0320 16:27:14.420929 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-w47cd_c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0/ovs-vswitchd/0.log" Mar 20 16:27:14 crc kubenswrapper[4708]: I0320 16:27:14.431852 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-w47cd_c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0/ovsdb-server-init/0.log" Mar 20 16:27:14 crc kubenswrapper[4708]: I0320 16:27:14.597863 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-w47cd_c7928aef-7fd4-45ee-b4a3-7f968d3e3ce0/ovsdb-server/0.log" Mar 20 16:27:14 crc kubenswrapper[4708]: I0320 16:27:14.711648 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5cde40d5-06a2-425d-a9c2-a00b10cc3148/openstack-network-exporter/0.log" Mar 20 16:27:14 crc kubenswrapper[4708]: I0320 16:27:14.724708 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5cde40d5-06a2-425d-a9c2-a00b10cc3148/ovn-northd/0.log" Mar 20 16:27:14 crc kubenswrapper[4708]: I0320 16:27:14.837944 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9db73214-be2f-4e2e-b703-8cc42aa0c86a/openstack-network-exporter/0.log" Mar 20 16:27:15 crc kubenswrapper[4708]: I0320 16:27:15.045137 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9db73214-be2f-4e2e-b703-8cc42aa0c86a/ovsdbserver-nb/0.log" Mar 20 16:27:15 crc kubenswrapper[4708]: I0320 16:27:15.086449 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bdb983a7-d139-4a8b-bbb5-6e65999c6be5/openstack-network-exporter/0.log" Mar 20 16:27:15 crc kubenswrapper[4708]: I0320 16:27:15.142678 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bdb983a7-d139-4a8b-bbb5-6e65999c6be5/ovsdbserver-sb/0.log" Mar 20 16:27:15 crc kubenswrapper[4708]: I0320 
16:27:15.289285 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65454bf644-7xssx_a74b0f3e-01c6-478c-9691-f48fab32af12/placement-api/0.log" Mar 20 16:27:15 crc kubenswrapper[4708]: I0320 16:27:15.316157 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65454bf644-7xssx_a74b0f3e-01c6-478c-9691-f48fab32af12/placement-log/0.log" Mar 20 16:27:15 crc kubenswrapper[4708]: I0320 16:27:15.506693 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8c1b-account-create-update-r92gp_e5e322f6-c3f7-4870-84ae-f27c1d4ba293/mariadb-account-create-update/0.log" Mar 20 16:27:15 crc kubenswrapper[4708]: I0320 16:27:15.590854 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-create-h7q7q_c03712fd-33bf-454a-a7f7-f907e9b9c0ec/mariadb-database-create/0.log" Mar 20 16:27:15 crc kubenswrapper[4708]: I0320 16:27:15.832509 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_103fe6f4-2ac5-430b-9ce4-2d142b273674/setup-container/0.log" Mar 20 16:27:15 crc kubenswrapper[4708]: I0320 16:27:15.891599 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-sync-7q7vc_3ded2837-c536-490b-a13c-2a09ea07a7aa/placement-db-sync/0.log" Mar 20 16:27:16 crc kubenswrapper[4708]: I0320 16:27:16.339838 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_103fe6f4-2ac5-430b-9ce4-2d142b273674/setup-container/0.log" Mar 20 16:27:16 crc kubenswrapper[4708]: I0320 16:27:16.349363 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_103fe6f4-2ac5-430b-9ce4-2d142b273674/rabbitmq/0.log" Mar 20 16:27:16 crc kubenswrapper[4708]: I0320 16:27:16.354473 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1da957bf-1f80-4bef-9033-333fa60118c3/setup-container/0.log" Mar 20 16:27:16 crc 
kubenswrapper[4708]: I0320 16:27:16.550377 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1da957bf-1f80-4bef-9033-333fa60118c3/setup-container/0.log" Mar 20 16:27:16 crc kubenswrapper[4708]: I0320 16:27:16.578501 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1da957bf-1f80-4bef-9033-333fa60118c3/rabbitmq/0.log" Mar 20 16:27:16 crc kubenswrapper[4708]: I0320 16:27:16.603389 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-bw6zk_c397eb69-e70b-4a8b-8c3f-162c06ccc6bc/mariadb-account-create-update/0.log" Mar 20 16:27:16 crc kubenswrapper[4708]: I0320 16:27:16.808321 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f48f97b7c-qw6zw_519360dd-4258-4c54-a239-55283b46ffb3/proxy-httpd/0.log" Mar 20 16:27:16 crc kubenswrapper[4708]: I0320 16:27:16.840687 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f48f97b7c-qw6zw_519360dd-4258-4c54-a239-55283b46ffb3/proxy-server/0.log" Mar 20 16:27:16 crc kubenswrapper[4708]: I0320 16:27:16.991337 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-s6247_ad381906-1b90-4230-a435-9ed844232ba1/swift-ring-rebalance/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.091593 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/account-auditor/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.160266 4708 scope.go:117] "RemoveContainer" containerID="ea3a32bd99e17c4a243bceeff47881485473daf06ed79d00bb8cf2e91c78cffa" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.172368 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/account-replicator/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.186140 4708 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/account-reaper/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.208871 4708 scope.go:117] "RemoveContainer" containerID="05b518bfc1c6f70dc0452b73d35a9222f4c8b3d89de1a070a700bb1630575c98" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.258092 4708 scope.go:117] "RemoveContainer" containerID="d0f141d43349a6b8e3ce00c7e3d44db4e425e7d8f794c14e2a3919f139885855" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.287023 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/account-server/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.359878 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/container-auditor/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.496637 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/container-updater/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.507762 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/container-server/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.510768 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/container-replicator/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.570831 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/object-auditor/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.604246 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d80d56a8-3037-4d99-afd9-61aeecc4259c/memcached/0.log" Mar 20 
16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.739453 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/object-expirer/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.739514 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/object-server/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.751836 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/object-replicator/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.799075 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/rsync/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.810570 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/object-updater/0.log" Mar 20 16:27:17 crc kubenswrapper[4708]: I0320 16:27:17.912865 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b5b88259-a142-42a0-ab2c-bb0980ad9465/swift-recon-cron/0.log" Mar 20 16:27:19 crc kubenswrapper[4708]: I0320 16:27:19.160634 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:19 crc kubenswrapper[4708]: I0320 16:27:19.161835 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:19 crc kubenswrapper[4708]: I0320 16:27:19.214711 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:19 crc kubenswrapper[4708]: I0320 16:27:19.270185 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:19 crc kubenswrapper[4708]: I0320 16:27:19.454748 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-69wwd"] Mar 20 16:27:21 crc kubenswrapper[4708]: I0320 16:27:21.192481 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-69wwd" podUID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" containerName="registry-server" containerID="cri-o://20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b" gracePeriod=2 Mar 20 16:27:21 crc kubenswrapper[4708]: I0320 16:27:21.746410 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:21 crc kubenswrapper[4708]: I0320 16:27:21.826633 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-utilities\") pod \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\" (UID: \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " Mar 20 16:27:21 crc kubenswrapper[4708]: I0320 16:27:21.826775 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-catalog-content\") pod \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\" (UID: \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " Mar 20 16:27:21 crc kubenswrapper[4708]: I0320 16:27:21.826933 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh7tk\" (UniqueName: \"kubernetes.io/projected/fc2c46d0-f762-4645-bbdb-093d7622c5fe-kube-api-access-jh7tk\") pod \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\" (UID: \"fc2c46d0-f762-4645-bbdb-093d7622c5fe\") " Mar 20 16:27:21 crc kubenswrapper[4708]: I0320 16:27:21.828443 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-utilities" (OuterVolumeSpecName: "utilities") pod "fc2c46d0-f762-4645-bbdb-093d7622c5fe" (UID: "fc2c46d0-f762-4645-bbdb-093d7622c5fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:27:21 crc kubenswrapper[4708]: I0320 16:27:21.837081 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2c46d0-f762-4645-bbdb-093d7622c5fe-kube-api-access-jh7tk" (OuterVolumeSpecName: "kube-api-access-jh7tk") pod "fc2c46d0-f762-4645-bbdb-093d7622c5fe" (UID: "fc2c46d0-f762-4645-bbdb-093d7622c5fe"). InnerVolumeSpecName "kube-api-access-jh7tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:27:21 crc kubenswrapper[4708]: I0320 16:27:21.929177 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh7tk\" (UniqueName: \"kubernetes.io/projected/fc2c46d0-f762-4645-bbdb-093d7622c5fe-kube-api-access-jh7tk\") on node \"crc\" DevicePath \"\"" Mar 20 16:27:21 crc kubenswrapper[4708]: I0320 16:27:21.929211 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.204035 4708 generic.go:334] "Generic (PLEG): container finished" podID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" containerID="20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b" exitCode=0 Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.204300 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69wwd" event={"ID":"fc2c46d0-f762-4645-bbdb-093d7622c5fe","Type":"ContainerDied","Data":"20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b"} Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.204381 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-69wwd" event={"ID":"fc2c46d0-f762-4645-bbdb-093d7622c5fe","Type":"ContainerDied","Data":"ae2e55c7b1a9a7967004a1385ca53cd1fb24be1984e4238704b7545f09a832b4"} Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.204405 4708 scope.go:117] "RemoveContainer" containerID="20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.204413 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69wwd" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.226093 4708 scope.go:117] "RemoveContainer" containerID="a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.254730 4708 scope.go:117] "RemoveContainer" containerID="605999be8876beff295beb8b07d5830897ca334eafa339629e711a00f56cdaee" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.300131 4708 scope.go:117] "RemoveContainer" containerID="20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b" Mar 20 16:27:22 crc kubenswrapper[4708]: E0320 16:27:22.300884 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b\": container with ID starting with 20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b not found: ID does not exist" containerID="20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.300930 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b"} err="failed to get container status \"20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b\": rpc error: code = NotFound desc = could not find container 
\"20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b\": container with ID starting with 20b83ad3608ad31607d1dc36a43950ceaa10aec4f522a15efe02c588ac91c32b not found: ID does not exist" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.300959 4708 scope.go:117] "RemoveContainer" containerID="a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1" Mar 20 16:27:22 crc kubenswrapper[4708]: E0320 16:27:22.301580 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1\": container with ID starting with a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1 not found: ID does not exist" containerID="a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.301608 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1"} err="failed to get container status \"a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1\": rpc error: code = NotFound desc = could not find container \"a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1\": container with ID starting with a99cbd70e419767b5163971162c98ab61b5a308fae039dd1b75cfbb9e5696ad1 not found: ID does not exist" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.301625 4708 scope.go:117] "RemoveContainer" containerID="605999be8876beff295beb8b07d5830897ca334eafa339629e711a00f56cdaee" Mar 20 16:27:22 crc kubenswrapper[4708]: E0320 16:27:22.301916 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"605999be8876beff295beb8b07d5830897ca334eafa339629e711a00f56cdaee\": container with ID starting with 605999be8876beff295beb8b07d5830897ca334eafa339629e711a00f56cdaee not found: ID does not exist" 
containerID="605999be8876beff295beb8b07d5830897ca334eafa339629e711a00f56cdaee" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.301944 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605999be8876beff295beb8b07d5830897ca334eafa339629e711a00f56cdaee"} err="failed to get container status \"605999be8876beff295beb8b07d5830897ca334eafa339629e711a00f56cdaee\": rpc error: code = NotFound desc = could not find container \"605999be8876beff295beb8b07d5830897ca334eafa339629e711a00f56cdaee\": container with ID starting with 605999be8876beff295beb8b07d5830897ca334eafa339629e711a00f56cdaee not found: ID does not exist" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.393260 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc2c46d0-f762-4645-bbdb-093d7622c5fe" (UID: "fc2c46d0-f762-4645-bbdb-093d7622c5fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.439394 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2c46d0-f762-4645-bbdb-093d7622c5fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.538798 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-69wwd"] Mar 20 16:27:22 crc kubenswrapper[4708]: I0320 16:27:22.549347 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-69wwd"] Mar 20 16:27:24 crc kubenswrapper[4708]: I0320 16:27:24.121997 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" path="/var/lib/kubelet/pods/fc2c46d0-f762-4645-bbdb-093d7622c5fe/volumes" Mar 20 16:27:26 crc kubenswrapper[4708]: I0320 16:27:26.178433 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:27:26 crc kubenswrapper[4708]: I0320 16:27:26.178739 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:27:26 crc kubenswrapper[4708]: I0320 16:27:26.178795 4708 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" Mar 20 16:27:26 crc kubenswrapper[4708]: I0320 16:27:26.179569 4708 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445"} pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:27:26 crc kubenswrapper[4708]: I0320 16:27:26.179621 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" containerID="cri-o://722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" gracePeriod=600 Mar 20 16:27:26 crc kubenswrapper[4708]: E0320 16:27:26.302633 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:27:27 crc kubenswrapper[4708]: I0320 16:27:27.250725 4708 generic.go:334] "Generic (PLEG): container finished" podID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" exitCode=0 Mar 20 16:27:27 crc kubenswrapper[4708]: I0320 16:27:27.250808 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerDied","Data":"722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445"} Mar 20 16:27:27 crc kubenswrapper[4708]: I0320 16:27:27.251072 4708 scope.go:117] "RemoveContainer" containerID="1802f9b5863d50f9b0059b425a0ca397c3796b016f71b5db8c43776fd2853ecb" Mar 20 16:27:27 crc 
kubenswrapper[4708]: I0320 16:27:27.251729 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:27:27 crc kubenswrapper[4708]: E0320 16:27:27.251984 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:27:38 crc kubenswrapper[4708]: I0320 16:27:38.111484 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:27:38 crc kubenswrapper[4708]: E0320 16:27:38.112425 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:27:39 crc kubenswrapper[4708]: I0320 16:27:39.408048 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-xr46h_c39a1357-1bbd-4bad-b18d-db9e7eacad56/manager/0.log" Mar 20 16:27:39 crc kubenswrapper[4708]: I0320 16:27:39.690398 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn_aeecd507-ad55-4ceb-994a-1431f6d686c6/util/0.log" Mar 20 16:27:39 crc kubenswrapper[4708]: I0320 16:27:39.883256 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn_aeecd507-ad55-4ceb-994a-1431f6d686c6/pull/0.log" Mar 20 16:27:39 crc kubenswrapper[4708]: I0320 16:27:39.899021 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn_aeecd507-ad55-4ceb-994a-1431f6d686c6/util/0.log" Mar 20 16:27:39 crc kubenswrapper[4708]: I0320 16:27:39.944061 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn_aeecd507-ad55-4ceb-994a-1431f6d686c6/pull/0.log" Mar 20 16:27:40 crc kubenswrapper[4708]: I0320 16:27:40.091792 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn_aeecd507-ad55-4ceb-994a-1431f6d686c6/util/0.log" Mar 20 16:27:40 crc kubenswrapper[4708]: I0320 16:27:40.171958 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn_aeecd507-ad55-4ceb-994a-1431f6d686c6/extract/0.log" Mar 20 16:27:40 crc kubenswrapper[4708]: I0320 16:27:40.182106 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9020db377e0f762bdb3f85da6af83fb47bc1f6c732afab5c32308bd74tspvn_aeecd507-ad55-4ceb-994a-1431f6d686c6/pull/0.log" Mar 20 16:27:40 crc kubenswrapper[4708]: I0320 16:27:40.365136 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-sl77r_d19b5a0c-7421-4454-9c5a-d2bf4828901a/manager/0.log" Mar 20 16:27:40 crc kubenswrapper[4708]: I0320 16:27:40.430721 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-fznb9_19a022dc-93c1-4899-837f-4fd78793d13d/manager/0.log" Mar 20 16:27:40 crc 
kubenswrapper[4708]: I0320 16:27:40.541881 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-nlvdk_0a721b9a-b944-45de-8eed-3f181e09f6cf/manager/0.log" Mar 20 16:27:40 crc kubenswrapper[4708]: I0320 16:27:40.630279 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-ttl8m_04a6f0a4-ca27-4a23-986b-db16468db950/manager/0.log" Mar 20 16:27:40 crc kubenswrapper[4708]: I0320 16:27:40.744508 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-zxbx5_2247a6d3-610d-4ca0-bfe6-711bb14a22cf/manager/0.log" Mar 20 16:27:40 crc kubenswrapper[4708]: I0320 16:27:40.949728 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-kckv6_01ac874a-addd-4513-94c5-cd4b71fa6eb5/manager/0.log" Mar 20 16:27:41 crc kubenswrapper[4708]: I0320 16:27:41.017787 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-8d4c8954d-tqkbf_355388b4-930a-4dfa-bb0e-ba8b71e9e40d/manager/0.log" Mar 20 16:27:41 crc kubenswrapper[4708]: I0320 16:27:41.085220 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-rxb8r_5b270fa1-96a9-4edd-b984-34900c29cc1b/manager/0.log" Mar 20 16:27:41 crc kubenswrapper[4708]: I0320 16:27:41.227578 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-52ngd_efd0041f-ae6d-4c76-a7b5-092d353a029e/manager/0.log" Mar 20 16:27:41 crc kubenswrapper[4708]: I0320 16:27:41.289661 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-5vc22_1f45c62f-cffa-416f-bead-eace9256fa45/manager/0.log" Mar 20 
16:27:41 crc kubenswrapper[4708]: I0320 16:27:41.490460 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-nthm4_fcf8691e-a944-41ca-8ee3-23280572780e/manager/0.log" Mar 20 16:27:41 crc kubenswrapper[4708]: I0320 16:27:41.576660 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-z8nqm_592c8a29-a52a-4b53-86b0-318dbce9424d/manager/0.log" Mar 20 16:27:41 crc kubenswrapper[4708]: I0320 16:27:41.699192 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-55b97_09c50326-bc3c-423d-a25a-39dc67a90efd/manager/0.log" Mar 20 16:27:41 crc kubenswrapper[4708]: I0320 16:27:41.771806 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f5dzcth_26cc5de3-9cef-462a-9e28-ac485ed04178/manager/0.log" Mar 20 16:27:42 crc kubenswrapper[4708]: I0320 16:27:42.062002 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-94465cd74-j22xf_5c3f787c-d4c6-49c8-99a2-c9f3ae02738a/operator/0.log" Mar 20 16:27:42 crc kubenswrapper[4708]: I0320 16:27:42.200152 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-m8lg6_be2b9d3f-c01f-4649-9b51-2068e6f541a8/registry-server/0.log" Mar 20 16:27:42 crc kubenswrapper[4708]: I0320 16:27:42.364492 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-hz97n_8589593b-d4f1-4492-8fdf-533db88caa80/manager/0.log" Mar 20 16:27:42 crc kubenswrapper[4708]: I0320 16:27:42.459816 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-pcdrn_4623ed97-f89b-4be1-9d67-e1a5aecf325d/manager/0.log" 
Mar 20 16:27:42 crc kubenswrapper[4708]: I0320 16:27:42.733923 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-2lpl8_e1d5280f-8680-4272-bf5f-48c2239f731e/manager/0.log" Mar 20 16:27:42 crc kubenswrapper[4708]: I0320 16:27:42.752125 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b5b55fc46-x92n5_a0b76f4f-b36a-4979-8110-db30012a6291/manager/0.log" Mar 20 16:27:42 crc kubenswrapper[4708]: I0320 16:27:42.924366 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-4v48w_ac41d7c1-6dc3-49f7-8623-1294368145af/manager/0.log" Mar 20 16:27:42 crc kubenswrapper[4708]: I0320 16:27:42.948444 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-rpx5s_9f93fb6f-7cd1-4022-a246-410d30457921/manager/0.log" Mar 20 16:27:43 crc kubenswrapper[4708]: I0320 16:27:43.118818 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-thrst_942f7804-64bd-4d04-badb-9cc592387608/manager/0.log" Mar 20 16:27:51 crc kubenswrapper[4708]: I0320 16:27:51.111142 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:27:51 crc kubenswrapper[4708]: E0320 16:27:51.111994 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.170987 4708 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567068-tv98j"] Mar 20 16:28:00 crc kubenswrapper[4708]: E0320 16:28:00.172919 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" containerName="extract-utilities" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.172949 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" containerName="extract-utilities" Mar 20 16:28:00 crc kubenswrapper[4708]: E0320 16:28:00.172970 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" containerName="extract-content" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.172982 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" containerName="extract-content" Mar 20 16:28:00 crc kubenswrapper[4708]: E0320 16:28:00.173024 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" containerName="registry-server" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.173036 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" containerName="registry-server" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.173356 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc2c46d0-f762-4645-bbdb-093d7622c5fe" containerName="registry-server" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.174482 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-tv98j" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.177985 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.180626 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.181979 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-tv98j"] Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.185270 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.303375 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb8xn\" (UniqueName: \"kubernetes.io/projected/b717bca9-34e7-41fb-b9ac-d65c674a3d22-kube-api-access-qb8xn\") pod \"auto-csr-approver-29567068-tv98j\" (UID: \"b717bca9-34e7-41fb-b9ac-d65c674a3d22\") " pod="openshift-infra/auto-csr-approver-29567068-tv98j" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.405642 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb8xn\" (UniqueName: \"kubernetes.io/projected/b717bca9-34e7-41fb-b9ac-d65c674a3d22-kube-api-access-qb8xn\") pod \"auto-csr-approver-29567068-tv98j\" (UID: \"b717bca9-34e7-41fb-b9ac-d65c674a3d22\") " pod="openshift-infra/auto-csr-approver-29567068-tv98j" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.437891 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb8xn\" (UniqueName: \"kubernetes.io/projected/b717bca9-34e7-41fb-b9ac-d65c674a3d22-kube-api-access-qb8xn\") pod \"auto-csr-approver-29567068-tv98j\" (UID: \"b717bca9-34e7-41fb-b9ac-d65c674a3d22\") " 
pod="openshift-infra/auto-csr-approver-29567068-tv98j" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.500316 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-tv98j" Mar 20 16:28:00 crc kubenswrapper[4708]: I0320 16:28:00.996836 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-tv98j"] Mar 20 16:28:01 crc kubenswrapper[4708]: I0320 16:28:01.599371 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-tv98j" event={"ID":"b717bca9-34e7-41fb-b9ac-d65c674a3d22","Type":"ContainerStarted","Data":"9bc504b45c4aad594a5b8a61c8e6a25f2ead7ec8079882e6f601d1d6dce8ff81"} Mar 20 16:28:02 crc kubenswrapper[4708]: I0320 16:28:02.802653 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qd9xk_aacba3e9-9c82-4ee9-9e4f-d37bd7e2ced6/control-plane-machine-set-operator/0.log" Mar 20 16:28:02 crc kubenswrapper[4708]: I0320 16:28:02.891531 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-q9z2q_716e1008-4ee5-42c3-9b4a-5c85a53489e0/kube-rbac-proxy/0.log" Mar 20 16:28:02 crc kubenswrapper[4708]: I0320 16:28:02.981579 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-q9z2q_716e1008-4ee5-42c3-9b4a-5c85a53489e0/machine-api-operator/0.log" Mar 20 16:28:03 crc kubenswrapper[4708]: I0320 16:28:03.112587 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:28:03 crc kubenswrapper[4708]: E0320 16:28:03.112815 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:28:03 crc kubenswrapper[4708]: I0320 16:28:03.637473 4708 generic.go:334] "Generic (PLEG): container finished" podID="b717bca9-34e7-41fb-b9ac-d65c674a3d22" containerID="27965af84deaa4be5455820c418d957a265e1e56d40386b38b3c24d1d1e96aef" exitCode=0 Mar 20 16:28:03 crc kubenswrapper[4708]: I0320 16:28:03.637522 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-tv98j" event={"ID":"b717bca9-34e7-41fb-b9ac-d65c674a3d22","Type":"ContainerDied","Data":"27965af84deaa4be5455820c418d957a265e1e56d40386b38b3c24d1d1e96aef"} Mar 20 16:28:05 crc kubenswrapper[4708]: I0320 16:28:05.013301 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-tv98j" Mar 20 16:28:05 crc kubenswrapper[4708]: I0320 16:28:05.194295 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb8xn\" (UniqueName: \"kubernetes.io/projected/b717bca9-34e7-41fb-b9ac-d65c674a3d22-kube-api-access-qb8xn\") pod \"b717bca9-34e7-41fb-b9ac-d65c674a3d22\" (UID: \"b717bca9-34e7-41fb-b9ac-d65c674a3d22\") " Mar 20 16:28:05 crc kubenswrapper[4708]: I0320 16:28:05.211962 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b717bca9-34e7-41fb-b9ac-d65c674a3d22-kube-api-access-qb8xn" (OuterVolumeSpecName: "kube-api-access-qb8xn") pod "b717bca9-34e7-41fb-b9ac-d65c674a3d22" (UID: "b717bca9-34e7-41fb-b9ac-d65c674a3d22"). InnerVolumeSpecName "kube-api-access-qb8xn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:28:05 crc kubenswrapper[4708]: I0320 16:28:05.298241 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb8xn\" (UniqueName: \"kubernetes.io/projected/b717bca9-34e7-41fb-b9ac-d65c674a3d22-kube-api-access-qb8xn\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:05 crc kubenswrapper[4708]: I0320 16:28:05.657258 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-tv98j" event={"ID":"b717bca9-34e7-41fb-b9ac-d65c674a3d22","Type":"ContainerDied","Data":"9bc504b45c4aad594a5b8a61c8e6a25f2ead7ec8079882e6f601d1d6dce8ff81"} Mar 20 16:28:05 crc kubenswrapper[4708]: I0320 16:28:05.657308 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bc504b45c4aad594a5b8a61c8e6a25f2ead7ec8079882e6f601d1d6dce8ff81" Mar 20 16:28:05 crc kubenswrapper[4708]: I0320 16:28:05.657373 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-tv98j" Mar 20 16:28:06 crc kubenswrapper[4708]: I0320 16:28:06.094020 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-wp76k"] Mar 20 16:28:06 crc kubenswrapper[4708]: I0320 16:28:06.103331 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-wp76k"] Mar 20 16:28:06 crc kubenswrapper[4708]: I0320 16:28:06.121096 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9260427-7f69-4872-9c37-912e6d1cd594" path="/var/lib/kubelet/pods/c9260427-7f69-4872-9c37-912e6d1cd594/volumes" Mar 20 16:28:14 crc kubenswrapper[4708]: I0320 16:28:14.112273 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:28:14 crc kubenswrapper[4708]: E0320 16:28:14.113408 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:28:14 crc kubenswrapper[4708]: I0320 16:28:14.864829 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-jdvxw_720cff92-f259-40b0-a7cb-efa0a67b8ff4/cert-manager-controller/0.log" Mar 20 16:28:15 crc kubenswrapper[4708]: I0320 16:28:15.048526 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-w7lk6_49e88ccc-7491-4ca3-8f14-4e5a82093d0b/cert-manager-cainjector/0.log" Mar 20 16:28:15 crc kubenswrapper[4708]: I0320 16:28:15.071296 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-tgmps_fa91730d-98e2-4cdf-a110-b3f8a4a95731/cert-manager-webhook/0.log" Mar 20 16:28:17 crc kubenswrapper[4708]: I0320 16:28:17.411111 4708 scope.go:117] "RemoveContainer" containerID="034181e78dd49dbec62362055d85c31d6fdf4fd68c8f308d4580c8de5b3e1f9d" Mar 20 16:28:17 crc kubenswrapper[4708]: I0320 16:28:17.450750 4708 scope.go:117] "RemoveContainer" containerID="13f8f5fe5a6eca2dbd0a42822bc6e8579ee26b3846c842eaa48eed01f89e9ec5" Mar 20 16:28:17 crc kubenswrapper[4708]: I0320 16:28:17.482570 4708 scope.go:117] "RemoveContainer" containerID="380c1ad3e5e7a07edf393f0538a366f39f35f8414874b9535bc2a3c535d831cf" Mar 20 16:28:17 crc kubenswrapper[4708]: I0320 16:28:17.533468 4708 scope.go:117] "RemoveContainer" containerID="4ba8669b02f07a77d6f20179a54d39ad8e036c49843f75706ab61fd4849e1aec" Mar 20 16:28:17 crc kubenswrapper[4708]: I0320 16:28:17.554827 4708 scope.go:117] "RemoveContainer" containerID="b944e61c5c16abc622e9a2e425e1b456cbf6c849ea7f1ff203f30fa8c1fcecd3" Mar 20 16:28:26 
crc kubenswrapper[4708]: I0320 16:28:26.117851 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:28:26 crc kubenswrapper[4708]: E0320 16:28:26.118588 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.495628 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-srj7s"] Mar 20 16:28:26 crc kubenswrapper[4708]: E0320 16:28:26.496144 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b717bca9-34e7-41fb-b9ac-d65c674a3d22" containerName="oc" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.496167 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b717bca9-34e7-41fb-b9ac-d65c674a3d22" containerName="oc" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.496416 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="b717bca9-34e7-41fb-b9ac-d65c674a3d22" containerName="oc" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.498237 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.508549 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srj7s"] Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.538681 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-catalog-content\") pod \"certified-operators-srj7s\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.539063 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-utilities\") pod \"certified-operators-srj7s\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.539086 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7vqz\" (UniqueName: \"kubernetes.io/projected/2ba730e6-21c7-4527-8314-85323c01ad68-kube-api-access-q7vqz\") pod \"certified-operators-srj7s\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.640490 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-utilities\") pod \"certified-operators-srj7s\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.640537 4708 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q7vqz\" (UniqueName: \"kubernetes.io/projected/2ba730e6-21c7-4527-8314-85323c01ad68-kube-api-access-q7vqz\") pod \"certified-operators-srj7s\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.640572 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-catalog-content\") pod \"certified-operators-srj7s\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.641077 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-utilities\") pod \"certified-operators-srj7s\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.641293 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-catalog-content\") pod \"certified-operators-srj7s\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.667715 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7vqz\" (UniqueName: \"kubernetes.io/projected/2ba730e6-21c7-4527-8314-85323c01ad68-kube-api-access-q7vqz\") pod \"certified-operators-srj7s\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:26 crc kubenswrapper[4708]: I0320 16:28:26.822171 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:27 crc kubenswrapper[4708]: I0320 16:28:27.371248 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srj7s"] Mar 20 16:28:27 crc kubenswrapper[4708]: I0320 16:28:27.862211 4708 generic.go:334] "Generic (PLEG): container finished" podID="2ba730e6-21c7-4527-8314-85323c01ad68" containerID="4e4b50fb9c50fce3b953737929cd8f455571a12ba1856485e2dc913ec5768732" exitCode=0 Mar 20 16:28:27 crc kubenswrapper[4708]: I0320 16:28:27.862267 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srj7s" event={"ID":"2ba730e6-21c7-4527-8314-85323c01ad68","Type":"ContainerDied","Data":"4e4b50fb9c50fce3b953737929cd8f455571a12ba1856485e2dc913ec5768732"} Mar 20 16:28:27 crc kubenswrapper[4708]: I0320 16:28:27.862522 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srj7s" event={"ID":"2ba730e6-21c7-4527-8314-85323c01ad68","Type":"ContainerStarted","Data":"6578a85c7e375173cdf9109f4b304c627e8ef309f9ef819b50d8d6627b6daec5"} Mar 20 16:28:28 crc kubenswrapper[4708]: I0320 16:28:28.229563 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-thkld_b64a32bd-b9f9-434c-98fa-7e178997d1f4/nmstate-console-plugin/0.log" Mar 20 16:28:28 crc kubenswrapper[4708]: I0320 16:28:28.418878 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-fv5f9_ecfb4ac0-f430-4e1b-ba99-1850c11ba37f/kube-rbac-proxy/0.log" Mar 20 16:28:28 crc kubenswrapper[4708]: I0320 16:28:28.452994 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qc6v2_aca1c7f0-410b-4e89-80a8-60f6005cee50/nmstate-handler/0.log" Mar 20 16:28:28 crc kubenswrapper[4708]: I0320 16:28:28.457254 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-fv5f9_ecfb4ac0-f430-4e1b-ba99-1850c11ba37f/nmstate-metrics/0.log" Mar 20 16:28:28 crc kubenswrapper[4708]: I0320 16:28:28.676065 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-pqzgj_25642ab2-e76a-4bd3-83cf-24c5cc896ff5/nmstate-operator/0.log" Mar 20 16:28:28 crc kubenswrapper[4708]: I0320 16:28:28.689351 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-v6ws2_a8616b74-fe5d-49c2-9a69-a2448ae072b2/nmstate-webhook/0.log" Mar 20 16:28:28 crc kubenswrapper[4708]: I0320 16:28:28.880392 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srj7s" event={"ID":"2ba730e6-21c7-4527-8314-85323c01ad68","Type":"ContainerStarted","Data":"826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4"} Mar 20 16:28:29 crc kubenswrapper[4708]: I0320 16:28:29.891812 4708 generic.go:334] "Generic (PLEG): container finished" podID="2ba730e6-21c7-4527-8314-85323c01ad68" containerID="826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4" exitCode=0 Mar 20 16:28:29 crc kubenswrapper[4708]: I0320 16:28:29.891889 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srj7s" event={"ID":"2ba730e6-21c7-4527-8314-85323c01ad68","Type":"ContainerDied","Data":"826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4"} Mar 20 16:28:30 crc kubenswrapper[4708]: I0320 16:28:30.903960 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srj7s" event={"ID":"2ba730e6-21c7-4527-8314-85323c01ad68","Type":"ContainerStarted","Data":"8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1"} Mar 20 16:28:36 crc kubenswrapper[4708]: I0320 16:28:36.822727 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:36 crc kubenswrapper[4708]: I0320 16:28:36.823217 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:36 crc kubenswrapper[4708]: I0320 16:28:36.900611 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:36 crc kubenswrapper[4708]: I0320 16:28:36.929732 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-srj7s" podStartSLOduration=8.294812374 podStartE2EDuration="10.929715517s" podCreationTimestamp="2026-03-20 16:28:26 +0000 UTC" firstStartedPulling="2026-03-20 16:28:27.8637274 +0000 UTC m=+1662.538064115" lastFinishedPulling="2026-03-20 16:28:30.498630543 +0000 UTC m=+1665.172967258" observedRunningTime="2026-03-20 16:28:30.924084799 +0000 UTC m=+1665.598421534" watchObservedRunningTime="2026-03-20 16:28:36.929715517 +0000 UTC m=+1671.604052232" Mar 20 16:28:37 crc kubenswrapper[4708]: I0320 16:28:37.047833 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:37 crc kubenswrapper[4708]: I0320 16:28:37.147230 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srj7s"] Mar 20 16:28:38 crc kubenswrapper[4708]: I0320 16:28:38.982573 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-srj7s" podUID="2ba730e6-21c7-4527-8314-85323c01ad68" containerName="registry-server" containerID="cri-o://8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1" gracePeriod=2 Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.650038 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.794495 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-utilities\") pod \"2ba730e6-21c7-4527-8314-85323c01ad68\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.794609 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-catalog-content\") pod \"2ba730e6-21c7-4527-8314-85323c01ad68\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.795652 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-utilities" (OuterVolumeSpecName: "utilities") pod "2ba730e6-21c7-4527-8314-85323c01ad68" (UID: "2ba730e6-21c7-4527-8314-85323c01ad68"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.801903 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7vqz\" (UniqueName: \"kubernetes.io/projected/2ba730e6-21c7-4527-8314-85323c01ad68-kube-api-access-q7vqz\") pod \"2ba730e6-21c7-4527-8314-85323c01ad68\" (UID: \"2ba730e6-21c7-4527-8314-85323c01ad68\") " Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.802774 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.807350 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ba730e6-21c7-4527-8314-85323c01ad68-kube-api-access-q7vqz" (OuterVolumeSpecName: "kube-api-access-q7vqz") pod "2ba730e6-21c7-4527-8314-85323c01ad68" (UID: "2ba730e6-21c7-4527-8314-85323c01ad68"). InnerVolumeSpecName "kube-api-access-q7vqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.905090 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7vqz\" (UniqueName: \"kubernetes.io/projected/2ba730e6-21c7-4527-8314-85323c01ad68-kube-api-access-q7vqz\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.992815 4708 generic.go:334] "Generic (PLEG): container finished" podID="2ba730e6-21c7-4527-8314-85323c01ad68" containerID="8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1" exitCode=0 Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.992874 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srj7s" event={"ID":"2ba730e6-21c7-4527-8314-85323c01ad68","Type":"ContainerDied","Data":"8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1"} Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.992918 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srj7s" Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.992926 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srj7s" event={"ID":"2ba730e6-21c7-4527-8314-85323c01ad68","Type":"ContainerDied","Data":"6578a85c7e375173cdf9109f4b304c627e8ef309f9ef819b50d8d6627b6daec5"} Mar 20 16:28:39 crc kubenswrapper[4708]: I0320 16:28:39.992939 4708 scope.go:117] "RemoveContainer" containerID="8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1" Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.025027 4708 scope.go:117] "RemoveContainer" containerID="826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4" Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.030038 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ba730e6-21c7-4527-8314-85323c01ad68" (UID: "2ba730e6-21c7-4527-8314-85323c01ad68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.047824 4708 scope.go:117] "RemoveContainer" containerID="4e4b50fb9c50fce3b953737929cd8f455571a12ba1856485e2dc913ec5768732" Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.108065 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ba730e6-21c7-4527-8314-85323c01ad68-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.127441 4708 scope.go:117] "RemoveContainer" containerID="8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1" Mar 20 16:28:40 crc kubenswrapper[4708]: E0320 16:28:40.129285 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1\": container with ID starting with 8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1 not found: ID does not exist" containerID="8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1" Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.129327 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1"} err="failed to get container status \"8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1\": rpc error: code = NotFound desc = could not find container \"8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1\": container with ID starting with 8561376d5640fe3af6a79538ce0f9fdd51b139d1516188fc9334f3475c9615c1 not found: ID does not exist" Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.129352 4708 scope.go:117] "RemoveContainer" containerID="826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4" Mar 20 16:28:40 crc kubenswrapper[4708]: E0320 16:28:40.131413 4708 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4\": container with ID starting with 826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4 not found: ID does not exist" containerID="826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4" Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.131480 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4"} err="failed to get container status \"826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4\": rpc error: code = NotFound desc = could not find container \"826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4\": container with ID starting with 826a4817edce9efd2a81f9cf681aa91f117418715e28bb1022f232c94b75d7e4 not found: ID does not exist" Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.131516 4708 scope.go:117] "RemoveContainer" containerID="4e4b50fb9c50fce3b953737929cd8f455571a12ba1856485e2dc913ec5768732" Mar 20 16:28:40 crc kubenswrapper[4708]: E0320 16:28:40.134122 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4b50fb9c50fce3b953737929cd8f455571a12ba1856485e2dc913ec5768732\": container with ID starting with 4e4b50fb9c50fce3b953737929cd8f455571a12ba1856485e2dc913ec5768732 not found: ID does not exist" containerID="4e4b50fb9c50fce3b953737929cd8f455571a12ba1856485e2dc913ec5768732" Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.134178 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4b50fb9c50fce3b953737929cd8f455571a12ba1856485e2dc913ec5768732"} err="failed to get container status \"4e4b50fb9c50fce3b953737929cd8f455571a12ba1856485e2dc913ec5768732\": rpc error: code = NotFound desc = could 
not find container \"4e4b50fb9c50fce3b953737929cd8f455571a12ba1856485e2dc913ec5768732\": container with ID starting with 4e4b50fb9c50fce3b953737929cd8f455571a12ba1856485e2dc913ec5768732 not found: ID does not exist" Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.322606 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srj7s"] Mar 20 16:28:40 crc kubenswrapper[4708]: I0320 16:28:40.332330 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-srj7s"] Mar 20 16:28:41 crc kubenswrapper[4708]: I0320 16:28:41.111739 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:28:41 crc kubenswrapper[4708]: E0320 16:28:41.112070 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:28:42 crc kubenswrapper[4708]: I0320 16:28:42.120503 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ba730e6-21c7-4527-8314-85323c01ad68" path="/var/lib/kubelet/pods/2ba730e6-21c7-4527-8314-85323c01ad68/volumes" Mar 20 16:28:52 crc kubenswrapper[4708]: I0320 16:28:52.111600 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:28:52 crc kubenswrapper[4708]: E0320 16:28:52.113472 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:28:55 crc kubenswrapper[4708]: I0320 16:28:55.393927 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-mqtxd_ad48fc57-2a20-4652-bf02-e0250154f5a2/kube-rbac-proxy/0.log" Mar 20 16:28:55 crc kubenswrapper[4708]: I0320 16:28:55.510458 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-mqtxd_ad48fc57-2a20-4652-bf02-e0250154f5a2/controller/0.log" Mar 20 16:28:55 crc kubenswrapper[4708]: I0320 16:28:55.561393 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-frr-files/0.log" Mar 20 16:28:55 crc kubenswrapper[4708]: I0320 16:28:55.786893 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-metrics/0.log" Mar 20 16:28:55 crc kubenswrapper[4708]: I0320 16:28:55.820879 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-reloader/0.log" Mar 20 16:28:55 crc kubenswrapper[4708]: I0320 16:28:55.826249 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-reloader/0.log" Mar 20 16:28:55 crc kubenswrapper[4708]: I0320 16:28:55.857773 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-frr-files/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.048407 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-reloader/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 
16:28:56.080769 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-metrics/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.098005 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-metrics/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.104897 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-frr-files/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.247679 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-metrics/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.255471 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-frr-files/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.271427 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/cp-reloader/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.300540 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/controller/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.484309 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/frr-metrics/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.485046 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/kube-rbac-proxy/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.530295 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/kube-rbac-proxy-frr/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.869210 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/reloader/0.log" Mar 20 16:28:56 crc kubenswrapper[4708]: I0320 16:28:56.920492 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-b542m_fbb237cd-490a-4b0b-9e60-e52df43516af/frr-k8s-webhook-server/0.log" Mar 20 16:28:57 crc kubenswrapper[4708]: I0320 16:28:57.095581 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-82c7x_03f3b7c8-445f-4ef1-9920-fa75d2fcd0be/frr/0.log" Mar 20 16:28:57 crc kubenswrapper[4708]: I0320 16:28:57.165416 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6687cdd9c4-rqtvv_95dd2919-dc89-4679-a48e-873f255af21e/manager/0.log" Mar 20 16:28:57 crc kubenswrapper[4708]: I0320 16:28:57.261788 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7f6b8f677-57vgm_52725a1b-5472-4e3a-8b88-4ed17ed3c44c/webhook-server/0.log" Mar 20 16:28:57 crc kubenswrapper[4708]: I0320 16:28:57.340908 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wddvr_4a141870-780d-49d9-b58a-6cbeed6b9000/kube-rbac-proxy/0.log" Mar 20 16:28:57 crc kubenswrapper[4708]: I0320 16:28:57.688307 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wddvr_4a141870-780d-49d9-b58a-6cbeed6b9000/speaker/0.log" Mar 20 16:29:04 crc kubenswrapper[4708]: I0320 16:29:04.111700 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:29:04 crc kubenswrapper[4708]: E0320 16:29:04.112595 4708 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:29:10 crc kubenswrapper[4708]: I0320 16:29:10.395813 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m_4b39784f-91ed-47c3-a778-3bd4f77ca786/util/0.log" Mar 20 16:29:10 crc kubenswrapper[4708]: I0320 16:29:10.542832 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m_4b39784f-91ed-47c3-a778-3bd4f77ca786/pull/0.log" Mar 20 16:29:10 crc kubenswrapper[4708]: I0320 16:29:10.563811 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m_4b39784f-91ed-47c3-a778-3bd4f77ca786/util/0.log" Mar 20 16:29:10 crc kubenswrapper[4708]: I0320 16:29:10.598017 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m_4b39784f-91ed-47c3-a778-3bd4f77ca786/pull/0.log" Mar 20 16:29:10 crc kubenswrapper[4708]: I0320 16:29:10.784046 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m_4b39784f-91ed-47c3-a778-3bd4f77ca786/util/0.log" Mar 20 16:29:10 crc kubenswrapper[4708]: I0320 16:29:10.788649 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m_4b39784f-91ed-47c3-a778-3bd4f77ca786/pull/0.log" Mar 20 16:29:10 crc kubenswrapper[4708]: I0320 
16:29:10.824184 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gx62m_4b39784f-91ed-47c3-a778-3bd4f77ca786/extract/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.017079 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z_199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2/util/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.149609 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z_199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2/util/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.187492 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z_199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2/pull/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.195130 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z_199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2/pull/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.365657 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z_199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2/util/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.406681 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z_199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2/pull/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.461846 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18l46z_199f5923-01d9-4bfe-ac0b-c8c89d9ea0d2/extract/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.542154 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dpwvk_d8e20300-bfe6-466f-ba27-835e4b432705/extract-utilities/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.719944 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dpwvk_d8e20300-bfe6-466f-ba27-835e4b432705/extract-utilities/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.738166 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dpwvk_d8e20300-bfe6-466f-ba27-835e4b432705/extract-content/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.744238 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dpwvk_d8e20300-bfe6-466f-ba27-835e4b432705/extract-content/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.905926 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dpwvk_d8e20300-bfe6-466f-ba27-835e4b432705/extract-utilities/0.log" Mar 20 16:29:11 crc kubenswrapper[4708]: I0320 16:29:11.936693 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dpwvk_d8e20300-bfe6-466f-ba27-835e4b432705/extract-content/0.log" Mar 20 16:29:12 crc kubenswrapper[4708]: I0320 16:29:12.144924 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4pc5d_8da9d79b-8b83-4a0b-bd07-dcb278ef137a/extract-utilities/0.log" Mar 20 16:29:12 crc kubenswrapper[4708]: I0320 16:29:12.343410 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-4pc5d_8da9d79b-8b83-4a0b-bd07-dcb278ef137a/extract-content/0.log" Mar 20 16:29:12 crc kubenswrapper[4708]: I0320 16:29:12.366092 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4pc5d_8da9d79b-8b83-4a0b-bd07-dcb278ef137a/extract-content/0.log" Mar 20 16:29:12 crc kubenswrapper[4708]: I0320 16:29:12.376408 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4pc5d_8da9d79b-8b83-4a0b-bd07-dcb278ef137a/extract-utilities/0.log" Mar 20 16:29:12 crc kubenswrapper[4708]: I0320 16:29:12.397623 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dpwvk_d8e20300-bfe6-466f-ba27-835e4b432705/registry-server/0.log" Mar 20 16:29:12 crc kubenswrapper[4708]: I0320 16:29:12.611569 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4pc5d_8da9d79b-8b83-4a0b-bd07-dcb278ef137a/extract-utilities/0.log" Mar 20 16:29:12 crc kubenswrapper[4708]: I0320 16:29:12.631592 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4pc5d_8da9d79b-8b83-4a0b-bd07-dcb278ef137a/extract-content/0.log" Mar 20 16:29:12 crc kubenswrapper[4708]: I0320 16:29:12.847974 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gczfl_8dbd2786-5db6-431f-8d8a-ca115e65df27/marketplace-operator/0.log" Mar 20 16:29:12 crc kubenswrapper[4708]: I0320 16:29:12.961253 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jcp6r_ee83f4f8-cabc-48ce-a0f2-a5c047a43d85/extract-utilities/0.log" Mar 20 16:29:13 crc kubenswrapper[4708]: I0320 16:29:13.022484 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-4pc5d_8da9d79b-8b83-4a0b-bd07-dcb278ef137a/registry-server/0.log" Mar 20 16:29:13 crc kubenswrapper[4708]: I0320 16:29:13.181169 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jcp6r_ee83f4f8-cabc-48ce-a0f2-a5c047a43d85/extract-utilities/0.log" Mar 20 16:29:13 crc kubenswrapper[4708]: I0320 16:29:13.207959 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jcp6r_ee83f4f8-cabc-48ce-a0f2-a5c047a43d85/extract-content/0.log" Mar 20 16:29:13 crc kubenswrapper[4708]: I0320 16:29:13.212610 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jcp6r_ee83f4f8-cabc-48ce-a0f2-a5c047a43d85/extract-content/0.log" Mar 20 16:29:13 crc kubenswrapper[4708]: I0320 16:29:13.379819 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jcp6r_ee83f4f8-cabc-48ce-a0f2-a5c047a43d85/extract-utilities/0.log" Mar 20 16:29:13 crc kubenswrapper[4708]: I0320 16:29:13.444930 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jcp6r_ee83f4f8-cabc-48ce-a0f2-a5c047a43d85/extract-content/0.log" Mar 20 16:29:13 crc kubenswrapper[4708]: I0320 16:29:13.477162 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jcp6r_ee83f4f8-cabc-48ce-a0f2-a5c047a43d85/registry-server/0.log" Mar 20 16:29:13 crc kubenswrapper[4708]: I0320 16:29:13.568154 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mckxb_33dad410-3bf5-4f44-a5d8-440d08d47ff3/extract-utilities/0.log" Mar 20 16:29:13 crc kubenswrapper[4708]: I0320 16:29:13.775629 4708 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-mckxb_33dad410-3bf5-4f44-a5d8-440d08d47ff3/extract-utilities/0.log" Mar 20 16:29:13 crc kubenswrapper[4708]: I0320 16:29:13.789573 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mckxb_33dad410-3bf5-4f44-a5d8-440d08d47ff3/extract-content/0.log" Mar 20 16:29:13 crc kubenswrapper[4708]: I0320 16:29:13.824326 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mckxb_33dad410-3bf5-4f44-a5d8-440d08d47ff3/extract-content/0.log" Mar 20 16:29:14 crc kubenswrapper[4708]: I0320 16:29:14.159970 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mckxb_33dad410-3bf5-4f44-a5d8-440d08d47ff3/extract-utilities/0.log" Mar 20 16:29:14 crc kubenswrapper[4708]: I0320 16:29:14.311231 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mckxb_33dad410-3bf5-4f44-a5d8-440d08d47ff3/extract-content/0.log" Mar 20 16:29:14 crc kubenswrapper[4708]: I0320 16:29:14.508211 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mckxb_33dad410-3bf5-4f44-a5d8-440d08d47ff3/registry-server/0.log" Mar 20 16:29:19 crc kubenswrapper[4708]: I0320 16:29:19.111500 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:29:19 crc kubenswrapper[4708]: E0320 16:29:19.113752 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:29:34 crc 
kubenswrapper[4708]: I0320 16:29:34.111654 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:29:34 crc kubenswrapper[4708]: E0320 16:29:34.112487 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:29:45 crc kubenswrapper[4708]: I0320 16:29:45.111526 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:29:45 crc kubenswrapper[4708]: E0320 16:29:45.112130 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:29:57 crc kubenswrapper[4708]: I0320 16:29:57.112060 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:29:57 crc kubenswrapper[4708]: E0320 16:29:57.112817 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 
20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.153955 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg"] Mar 20 16:30:00 crc kubenswrapper[4708]: E0320 16:30:00.156139 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba730e6-21c7-4527-8314-85323c01ad68" containerName="extract-utilities" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.156187 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba730e6-21c7-4527-8314-85323c01ad68" containerName="extract-utilities" Mar 20 16:30:00 crc kubenswrapper[4708]: E0320 16:30:00.156229 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba730e6-21c7-4527-8314-85323c01ad68" containerName="registry-server" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.156243 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba730e6-21c7-4527-8314-85323c01ad68" containerName="registry-server" Mar 20 16:30:00 crc kubenswrapper[4708]: E0320 16:30:00.156280 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba730e6-21c7-4527-8314-85323c01ad68" containerName="extract-content" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.156289 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba730e6-21c7-4527-8314-85323c01ad68" containerName="extract-content" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.157697 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ba730e6-21c7-4527-8314-85323c01ad68" containerName="registry-server" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.159428 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.164170 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.164608 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.178412 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrxm4\" (UniqueName: \"kubernetes.io/projected/ee8e1026-b0fd-477d-94f0-9d577115e2ba-kube-api-access-hrxm4\") pod \"collect-profiles-29567070-brdlg\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.178508 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee8e1026-b0fd-477d-94f0-9d577115e2ba-config-volume\") pod \"collect-profiles-29567070-brdlg\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.178664 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee8e1026-b0fd-477d-94f0-9d577115e2ba-secret-volume\") pod \"collect-profiles-29567070-brdlg\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.183249 4708 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567070-wpppp"] Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.187300 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-wpppp" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.190219 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.191830 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.192022 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.198652 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg"] Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.218358 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-wpppp"] Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.281088 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67vzb\" (UniqueName: \"kubernetes.io/projected/8addf4a2-e310-4f92-9082-048c3cafca93-kube-api-access-67vzb\") pod \"auto-csr-approver-29567070-wpppp\" (UID: \"8addf4a2-e310-4f92-9082-048c3cafca93\") " pod="openshift-infra/auto-csr-approver-29567070-wpppp" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.281444 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrxm4\" (UniqueName: \"kubernetes.io/projected/ee8e1026-b0fd-477d-94f0-9d577115e2ba-kube-api-access-hrxm4\") pod \"collect-profiles-29567070-brdlg\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.281556 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee8e1026-b0fd-477d-94f0-9d577115e2ba-config-volume\") pod \"collect-profiles-29567070-brdlg\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.281732 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee8e1026-b0fd-477d-94f0-9d577115e2ba-secret-volume\") pod \"collect-profiles-29567070-brdlg\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.282489 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee8e1026-b0fd-477d-94f0-9d577115e2ba-config-volume\") pod \"collect-profiles-29567070-brdlg\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.287378 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee8e1026-b0fd-477d-94f0-9d577115e2ba-secret-volume\") pod \"collect-profiles-29567070-brdlg\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.296520 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrxm4\" (UniqueName: \"kubernetes.io/projected/ee8e1026-b0fd-477d-94f0-9d577115e2ba-kube-api-access-hrxm4\") pod 
\"collect-profiles-29567070-brdlg\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.383660 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67vzb\" (UniqueName: \"kubernetes.io/projected/8addf4a2-e310-4f92-9082-048c3cafca93-kube-api-access-67vzb\") pod \"auto-csr-approver-29567070-wpppp\" (UID: \"8addf4a2-e310-4f92-9082-048c3cafca93\") " pod="openshift-infra/auto-csr-approver-29567070-wpppp" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.401721 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67vzb\" (UniqueName: \"kubernetes.io/projected/8addf4a2-e310-4f92-9082-048c3cafca93-kube-api-access-67vzb\") pod \"auto-csr-approver-29567070-wpppp\" (UID: \"8addf4a2-e310-4f92-9082-048c3cafca93\") " pod="openshift-infra/auto-csr-approver-29567070-wpppp" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.486646 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.513102 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-wpppp" Mar 20 16:30:00 crc kubenswrapper[4708]: I0320 16:30:00.993587 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg"] Mar 20 16:30:01 crc kubenswrapper[4708]: I0320 16:30:01.057303 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-wpppp"] Mar 20 16:30:01 crc kubenswrapper[4708]: I0320 16:30:01.065974 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" event={"ID":"ee8e1026-b0fd-477d-94f0-9d577115e2ba","Type":"ContainerStarted","Data":"6e2fd298a83c9ff72ddb42ab724fca0b000e7b2a31db62f7b8fd386959a4f6cd"} Mar 20 16:30:01 crc kubenswrapper[4708]: W0320 16:30:01.067477 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8addf4a2_e310_4f92_9082_048c3cafca93.slice/crio-ff0b49cfa6efc080530b43cdd3b5cf1826c04e9475fe24c2a2553641eb7c2f2f WatchSource:0}: Error finding container ff0b49cfa6efc080530b43cdd3b5cf1826c04e9475fe24c2a2553641eb7c2f2f: Status 404 returned error can't find the container with id ff0b49cfa6efc080530b43cdd3b5cf1826c04e9475fe24c2a2553641eb7c2f2f Mar 20 16:30:02 crc kubenswrapper[4708]: I0320 16:30:02.080103 4708 generic.go:334] "Generic (PLEG): container finished" podID="ee8e1026-b0fd-477d-94f0-9d577115e2ba" containerID="27e9a33df176e7a4764a0af03750ecf04273f8be22d4813182eac59b5b5d9b0e" exitCode=0 Mar 20 16:30:02 crc kubenswrapper[4708]: I0320 16:30:02.080275 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" event={"ID":"ee8e1026-b0fd-477d-94f0-9d577115e2ba","Type":"ContainerDied","Data":"27e9a33df176e7a4764a0af03750ecf04273f8be22d4813182eac59b5b5d9b0e"} Mar 20 16:30:02 crc kubenswrapper[4708]: I0320 16:30:02.083050 4708 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-wpppp" event={"ID":"8addf4a2-e310-4f92-9082-048c3cafca93","Type":"ContainerStarted","Data":"ff0b49cfa6efc080530b43cdd3b5cf1826c04e9475fe24c2a2553641eb7c2f2f"} Mar 20 16:30:03 crc kubenswrapper[4708]: I0320 16:30:03.496053 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:03 crc kubenswrapper[4708]: I0320 16:30:03.650208 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee8e1026-b0fd-477d-94f0-9d577115e2ba-secret-volume\") pod \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " Mar 20 16:30:03 crc kubenswrapper[4708]: I0320 16:30:03.650449 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee8e1026-b0fd-477d-94f0-9d577115e2ba-config-volume\") pod \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " Mar 20 16:30:03 crc kubenswrapper[4708]: I0320 16:30:03.650546 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrxm4\" (UniqueName: \"kubernetes.io/projected/ee8e1026-b0fd-477d-94f0-9d577115e2ba-kube-api-access-hrxm4\") pod \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\" (UID: \"ee8e1026-b0fd-477d-94f0-9d577115e2ba\") " Mar 20 16:30:03 crc kubenswrapper[4708]: I0320 16:30:03.652343 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8e1026-b0fd-477d-94f0-9d577115e2ba-config-volume" (OuterVolumeSpecName: "config-volume") pod "ee8e1026-b0fd-477d-94f0-9d577115e2ba" (UID: "ee8e1026-b0fd-477d-94f0-9d577115e2ba"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:30:03 crc kubenswrapper[4708]: I0320 16:30:03.657053 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee8e1026-b0fd-477d-94f0-9d577115e2ba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ee8e1026-b0fd-477d-94f0-9d577115e2ba" (UID: "ee8e1026-b0fd-477d-94f0-9d577115e2ba"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:30:03 crc kubenswrapper[4708]: I0320 16:30:03.658954 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee8e1026-b0fd-477d-94f0-9d577115e2ba-kube-api-access-hrxm4" (OuterVolumeSpecName: "kube-api-access-hrxm4") pod "ee8e1026-b0fd-477d-94f0-9d577115e2ba" (UID: "ee8e1026-b0fd-477d-94f0-9d577115e2ba"). InnerVolumeSpecName "kube-api-access-hrxm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:30:03 crc kubenswrapper[4708]: I0320 16:30:03.753102 4708 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee8e1026-b0fd-477d-94f0-9d577115e2ba-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:30:03 crc kubenswrapper[4708]: I0320 16:30:03.753143 4708 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee8e1026-b0fd-477d-94f0-9d577115e2ba-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:30:03 crc kubenswrapper[4708]: I0320 16:30:03.753155 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrxm4\" (UniqueName: \"kubernetes.io/projected/ee8e1026-b0fd-477d-94f0-9d577115e2ba-kube-api-access-hrxm4\") on node \"crc\" DevicePath \"\"" Mar 20 16:30:04 crc kubenswrapper[4708]: I0320 16:30:04.104539 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" 
event={"ID":"ee8e1026-b0fd-477d-94f0-9d577115e2ba","Type":"ContainerDied","Data":"6e2fd298a83c9ff72ddb42ab724fca0b000e7b2a31db62f7b8fd386959a4f6cd"} Mar 20 16:30:04 crc kubenswrapper[4708]: I0320 16:30:04.104590 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e2fd298a83c9ff72ddb42ab724fca0b000e7b2a31db62f7b8fd386959a4f6cd" Mar 20 16:30:04 crc kubenswrapper[4708]: I0320 16:30:04.104646 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-brdlg" Mar 20 16:30:05 crc kubenswrapper[4708]: I0320 16:30:05.115976 4708 generic.go:334] "Generic (PLEG): container finished" podID="8addf4a2-e310-4f92-9082-048c3cafca93" containerID="35d464957d426bd7fa70d76ce54d9862d82c089c5181510e30d9c4ac4dd9d9bf" exitCode=0 Mar 20 16:30:05 crc kubenswrapper[4708]: I0320 16:30:05.116073 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-wpppp" event={"ID":"8addf4a2-e310-4f92-9082-048c3cafca93","Type":"ContainerDied","Data":"35d464957d426bd7fa70d76ce54d9862d82c089c5181510e30d9c4ac4dd9d9bf"} Mar 20 16:30:06 crc kubenswrapper[4708]: I0320 16:30:06.541757 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-wpppp" Mar 20 16:30:06 crc kubenswrapper[4708]: I0320 16:30:06.711890 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67vzb\" (UniqueName: \"kubernetes.io/projected/8addf4a2-e310-4f92-9082-048c3cafca93-kube-api-access-67vzb\") pod \"8addf4a2-e310-4f92-9082-048c3cafca93\" (UID: \"8addf4a2-e310-4f92-9082-048c3cafca93\") " Mar 20 16:30:06 crc kubenswrapper[4708]: I0320 16:30:06.718553 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8addf4a2-e310-4f92-9082-048c3cafca93-kube-api-access-67vzb" (OuterVolumeSpecName: "kube-api-access-67vzb") pod "8addf4a2-e310-4f92-9082-048c3cafca93" (UID: "8addf4a2-e310-4f92-9082-048c3cafca93"). InnerVolumeSpecName "kube-api-access-67vzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:30:06 crc kubenswrapper[4708]: I0320 16:30:06.815275 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67vzb\" (UniqueName: \"kubernetes.io/projected/8addf4a2-e310-4f92-9082-048c3cafca93-kube-api-access-67vzb\") on node \"crc\" DevicePath \"\"" Mar 20 16:30:07 crc kubenswrapper[4708]: I0320 16:30:07.159793 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-wpppp" event={"ID":"8addf4a2-e310-4f92-9082-048c3cafca93","Type":"ContainerDied","Data":"ff0b49cfa6efc080530b43cdd3b5cf1826c04e9475fe24c2a2553641eb7c2f2f"} Mar 20 16:30:07 crc kubenswrapper[4708]: I0320 16:30:07.159847 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff0b49cfa6efc080530b43cdd3b5cf1826c04e9475fe24c2a2553641eb7c2f2f" Mar 20 16:30:07 crc kubenswrapper[4708]: I0320 16:30:07.159932 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-wpppp" Mar 20 16:30:07 crc kubenswrapper[4708]: I0320 16:30:07.631392 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-kth9n"] Mar 20 16:30:07 crc kubenswrapper[4708]: I0320 16:30:07.645510 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-kth9n"] Mar 20 16:30:08 crc kubenswrapper[4708]: I0320 16:30:08.123094 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55f18481-391a-4e73-880c-f1d058ac7564" path="/var/lib/kubelet/pods/55f18481-391a-4e73-880c-f1d058ac7564/volumes" Mar 20 16:30:10 crc kubenswrapper[4708]: I0320 16:30:10.111499 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:30:10 crc kubenswrapper[4708]: E0320 16:30:10.112413 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:30:17 crc kubenswrapper[4708]: I0320 16:30:17.746024 4708 scope.go:117] "RemoveContainer" containerID="3b456a81cb7201f04a2346e2de43280f2de53da90db40ec10ff5b5e2e9c7b99c" Mar 20 16:30:17 crc kubenswrapper[4708]: I0320 16:30:17.780737 4708 scope.go:117] "RemoveContainer" containerID="27ba12f7d84ac8edb2c5b559c04d5ce79ae4491d1838c76577e0b6c59b8e095e" Mar 20 16:30:17 crc kubenswrapper[4708]: I0320 16:30:17.807650 4708 scope.go:117] "RemoveContainer" containerID="10e907f3de9388e78a11a657c0d845a24907fd75ef92953dd76b9a81580245e8" Mar 20 16:30:17 crc kubenswrapper[4708]: I0320 16:30:17.871406 4708 scope.go:117] "RemoveContainer" 
containerID="13cbad05c000901dcea7491f96b1f0073385a627c23a0dc556a431c8a31067d2" Mar 20 16:30:17 crc kubenswrapper[4708]: I0320 16:30:17.895965 4708 scope.go:117] "RemoveContainer" containerID="eb25e4e247261583daa1a69711542c6735a3453bd1ba79928a55fd1b90a9063d" Mar 20 16:30:17 crc kubenswrapper[4708]: I0320 16:30:17.921364 4708 scope.go:117] "RemoveContainer" containerID="0063a85d53b4de8c9ef6265aee8aea5aeedd224c1af6af410dfcfbaa21d2e673" Mar 20 16:30:17 crc kubenswrapper[4708]: I0320 16:30:17.944779 4708 scope.go:117] "RemoveContainer" containerID="e2d337afd8cd334dc8e3e63e3bb1b44fe7ebfdb48b3138de04d17ec742db897a" Mar 20 16:30:22 crc kubenswrapper[4708]: I0320 16:30:22.111660 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:30:22 crc kubenswrapper[4708]: E0320 16:30:22.112530 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:30:35 crc kubenswrapper[4708]: I0320 16:30:35.111170 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:30:35 crc kubenswrapper[4708]: E0320 16:30:35.112031 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:30:37 crc 
kubenswrapper[4708]: I0320 16:30:37.046947 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-930e-account-create-update-nzvsx"] Mar 20 16:30:37 crc kubenswrapper[4708]: I0320 16:30:37.059926 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-930e-account-create-update-nzvsx"] Mar 20 16:30:38 crc kubenswrapper[4708]: I0320 16:30:38.027397 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-h7q7q"] Mar 20 16:30:38 crc kubenswrapper[4708]: I0320 16:30:38.036547 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8c1b-account-create-update-r92gp"] Mar 20 16:30:38 crc kubenswrapper[4708]: I0320 16:30:38.048545 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-p756b"] Mar 20 16:30:38 crc kubenswrapper[4708]: I0320 16:30:38.059342 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-p756b"] Mar 20 16:30:38 crc kubenswrapper[4708]: I0320 16:30:38.070027 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-h7q7q"] Mar 20 16:30:38 crc kubenswrapper[4708]: I0320 16:30:38.080623 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8c1b-account-create-update-r92gp"] Mar 20 16:30:38 crc kubenswrapper[4708]: I0320 16:30:38.123226 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a31e27-283d-43b0-91b5-71c548d61e27" path="/var/lib/kubelet/pods/50a31e27-283d-43b0-91b5-71c548d61e27/volumes" Mar 20 16:30:38 crc kubenswrapper[4708]: I0320 16:30:38.123917 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="799c71bd-9b76-4d2d-a4b9-6953e7ee2863" path="/var/lib/kubelet/pods/799c71bd-9b76-4d2d-a4b9-6953e7ee2863/volumes" Mar 20 16:30:38 crc kubenswrapper[4708]: I0320 16:30:38.124542 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c03712fd-33bf-454a-a7f7-f907e9b9c0ec" path="/var/lib/kubelet/pods/c03712fd-33bf-454a-a7f7-f907e9b9c0ec/volumes" Mar 20 16:30:38 crc kubenswrapper[4708]: I0320 16:30:38.125212 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e322f6-c3f7-4870-84ae-f27c1d4ba293" path="/var/lib/kubelet/pods/e5e322f6-c3f7-4870-84ae-f27c1d4ba293/volumes" Mar 20 16:30:41 crc kubenswrapper[4708]: I0320 16:30:41.029406 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3dc6-account-create-update-cqsnp"] Mar 20 16:30:41 crc kubenswrapper[4708]: I0320 16:30:41.037749 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-dw2jc"] Mar 20 16:30:41 crc kubenswrapper[4708]: I0320 16:30:41.046985 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3dc6-account-create-update-cqsnp"] Mar 20 16:30:41 crc kubenswrapper[4708]: I0320 16:30:41.055051 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-dw2jc"] Mar 20 16:30:42 crc kubenswrapper[4708]: I0320 16:30:42.127039 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5609997e-9b3b-4472-a91a-0948eacb77f1" path="/var/lib/kubelet/pods/5609997e-9b3b-4472-a91a-0948eacb77f1/volumes" Mar 20 16:30:42 crc kubenswrapper[4708]: I0320 16:30:42.128146 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720c033b-2069-47be-b543-00c6005b496b" path="/var/lib/kubelet/pods/720c033b-2069-47be-b543-00c6005b496b/volumes" Mar 20 16:30:43 crc kubenswrapper[4708]: I0320 16:30:43.039192 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bw6zk"] Mar 20 16:30:43 crc kubenswrapper[4708]: I0320 16:30:43.057361 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bw6zk"] Mar 20 16:30:44 crc kubenswrapper[4708]: I0320 16:30:44.121179 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="c397eb69-e70b-4a8b-8c3f-162c06ccc6bc" path="/var/lib/kubelet/pods/c397eb69-e70b-4a8b-8c3f-162c06ccc6bc/volumes" Mar 20 16:30:47 crc kubenswrapper[4708]: I0320 16:30:47.111172 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:30:47 crc kubenswrapper[4708]: E0320 16:30:47.111752 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:30:58 crc kubenswrapper[4708]: I0320 16:30:58.615028 4708 generic.go:334] "Generic (PLEG): container finished" podID="ebe8ded9-92d5-473f-a111-9c4fea2091ba" containerID="252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542" exitCode=0 Mar 20 16:30:58 crc kubenswrapper[4708]: I0320 16:30:58.615199 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7pjs/must-gather-6mfnj" event={"ID":"ebe8ded9-92d5-473f-a111-9c4fea2091ba","Type":"ContainerDied","Data":"252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542"} Mar 20 16:30:58 crc kubenswrapper[4708]: I0320 16:30:58.616199 4708 scope.go:117] "RemoveContainer" containerID="252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542" Mar 20 16:30:59 crc kubenswrapper[4708]: I0320 16:30:59.111037 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:30:59 crc kubenswrapper[4708]: E0320 16:30:59.111605 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:30:59 crc kubenswrapper[4708]: I0320 16:30:59.389927 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g7pjs_must-gather-6mfnj_ebe8ded9-92d5-473f-a111-9c4fea2091ba/gather/0.log" Mar 20 16:31:06 crc kubenswrapper[4708]: I0320 16:31:06.869325 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g7pjs/must-gather-6mfnj"] Mar 20 16:31:06 crc kubenswrapper[4708]: I0320 16:31:06.870284 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-g7pjs/must-gather-6mfnj" podUID="ebe8ded9-92d5-473f-a111-9c4fea2091ba" containerName="copy" containerID="cri-o://dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0" gracePeriod=2 Mar 20 16:31:06 crc kubenswrapper[4708]: I0320 16:31:06.878656 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g7pjs/must-gather-6mfnj"] Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.277628 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g7pjs_must-gather-6mfnj_ebe8ded9-92d5-473f-a111-9c4fea2091ba/copy/0.log" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.278378 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7pjs/must-gather-6mfnj" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.418064 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebe8ded9-92d5-473f-a111-9c4fea2091ba-must-gather-output\") pod \"ebe8ded9-92d5-473f-a111-9c4fea2091ba\" (UID: \"ebe8ded9-92d5-473f-a111-9c4fea2091ba\") " Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.418153 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmhdd\" (UniqueName: \"kubernetes.io/projected/ebe8ded9-92d5-473f-a111-9c4fea2091ba-kube-api-access-hmhdd\") pod \"ebe8ded9-92d5-473f-a111-9c4fea2091ba\" (UID: \"ebe8ded9-92d5-473f-a111-9c4fea2091ba\") " Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.425514 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe8ded9-92d5-473f-a111-9c4fea2091ba-kube-api-access-hmhdd" (OuterVolumeSpecName: "kube-api-access-hmhdd") pod "ebe8ded9-92d5-473f-a111-9c4fea2091ba" (UID: "ebe8ded9-92d5-473f-a111-9c4fea2091ba"). InnerVolumeSpecName "kube-api-access-hmhdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.521064 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmhdd\" (UniqueName: \"kubernetes.io/projected/ebe8ded9-92d5-473f-a111-9c4fea2091ba-kube-api-access-hmhdd\") on node \"crc\" DevicePath \"\"" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.546303 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe8ded9-92d5-473f-a111-9c4fea2091ba-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ebe8ded9-92d5-473f-a111-9c4fea2091ba" (UID: "ebe8ded9-92d5-473f-a111-9c4fea2091ba"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.622768 4708 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebe8ded9-92d5-473f-a111-9c4fea2091ba-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.698959 4708 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g7pjs_must-gather-6mfnj_ebe8ded9-92d5-473f-a111-9c4fea2091ba/copy/0.log" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.699285 4708 generic.go:334] "Generic (PLEG): container finished" podID="ebe8ded9-92d5-473f-a111-9c4fea2091ba" containerID="dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0" exitCode=143 Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.699336 4708 scope.go:117] "RemoveContainer" containerID="dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.699361 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7pjs/must-gather-6mfnj" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.738030 4708 scope.go:117] "RemoveContainer" containerID="252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.854735 4708 scope.go:117] "RemoveContainer" containerID="dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0" Mar 20 16:31:07 crc kubenswrapper[4708]: E0320 16:31:07.855261 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0\": container with ID starting with dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0 not found: ID does not exist" containerID="dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.855307 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0"} err="failed to get container status \"dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0\": rpc error: code = NotFound desc = could not find container \"dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0\": container with ID starting with dd94a813b7cea89cddbd798bd578bdb90dae7786499d40f5e05213cc54c7bfc0 not found: ID does not exist" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.855334 4708 scope.go:117] "RemoveContainer" containerID="252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542" Mar 20 16:31:07 crc kubenswrapper[4708]: E0320 16:31:07.855774 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542\": container with ID starting with 
252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542 not found: ID does not exist" containerID="252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542" Mar 20 16:31:07 crc kubenswrapper[4708]: I0320 16:31:07.855800 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542"} err="failed to get container status \"252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542\": rpc error: code = NotFound desc = could not find container \"252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542\": container with ID starting with 252e105f789eeb3a9dd6c17ffbe88e2aff33ade026d568d13a221038b2bb4542 not found: ID does not exist" Mar 20 16:31:08 crc kubenswrapper[4708]: I0320 16:31:08.121439 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe8ded9-92d5-473f-a111-9c4fea2091ba" path="/var/lib/kubelet/pods/ebe8ded9-92d5-473f-a111-9c4fea2091ba/volumes" Mar 20 16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.039579 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qnn97"] Mar 20 16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.049188 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-nbvxk"] Mar 20 16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.061633 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-dg24d"] Mar 20 16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.073471 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qnn97"] Mar 20 16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.084627 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-42e7-account-create-update-jmvzr"] Mar 20 16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.093757 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-nbvxk"] Mar 20 
16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.101482 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bed-account-create-update-8dhzm"] Mar 20 16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.109238 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5bed-account-create-update-8dhzm"] Mar 20 16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.116432 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-dg24d"] Mar 20 16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.123687 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-42e7-account-create-update-jmvzr"] Mar 20 16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.130728 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5805-account-create-update-2qn7m"] Mar 20 16:31:09 crc kubenswrapper[4708]: I0320 16:31:09.137683 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5805-account-create-update-2qn7m"] Mar 20 16:31:10 crc kubenswrapper[4708]: I0320 16:31:10.122421 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15badc78-5ab8-41aa-acfb-4bb1f28bcbab" path="/var/lib/kubelet/pods/15badc78-5ab8-41aa-acfb-4bb1f28bcbab/volumes" Mar 20 16:31:10 crc kubenswrapper[4708]: I0320 16:31:10.123406 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b15f38d-ded0-4fed-add0-c891d2208014" path="/var/lib/kubelet/pods/1b15f38d-ded0-4fed-add0-c891d2208014/volumes" Mar 20 16:31:10 crc kubenswrapper[4708]: I0320 16:31:10.123971 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247ec5aa-1401-422a-b9f7-71c8c9b4876e" path="/var/lib/kubelet/pods/247ec5aa-1401-422a-b9f7-71c8c9b4876e/volumes" Mar 20 16:31:10 crc kubenswrapper[4708]: I0320 16:31:10.124547 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9a8c73-828e-42ef-9818-6aab510e8240" 
path="/var/lib/kubelet/pods/3a9a8c73-828e-42ef-9818-6aab510e8240/volumes" Mar 20 16:31:10 crc kubenswrapper[4708]: I0320 16:31:10.125560 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e28ace2-9e11-4223-b58d-91688cd2ced4" path="/var/lib/kubelet/pods/4e28ace2-9e11-4223-b58d-91688cd2ced4/volumes" Mar 20 16:31:10 crc kubenswrapper[4708]: I0320 16:31:10.126107 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949acc8b-4603-4567-9864-0328462133a0" path="/var/lib/kubelet/pods/949acc8b-4603-4567-9864-0328462133a0/volumes" Mar 20 16:31:12 crc kubenswrapper[4708]: I0320 16:31:12.111274 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:31:12 crc kubenswrapper[4708]: E0320 16:31:12.111870 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:31:13 crc kubenswrapper[4708]: I0320 16:31:13.026055 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-z2bxx"] Mar 20 16:31:13 crc kubenswrapper[4708]: I0320 16:31:13.033856 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-z2bxx"] Mar 20 16:31:14 crc kubenswrapper[4708]: I0320 16:31:14.121817 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d5e7b3a-c1c7-493a-a587-19d751f038be" path="/var/lib/kubelet/pods/8d5e7b3a-c1c7-493a-a587-19d751f038be/volumes" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.060294 4708 scope.go:117] "RemoveContainer" containerID="075295be7d5f1e1262933d9308b0c5783279ff6465320622f55a2f0933c7df02" Mar 20 
16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.288161 4708 scope.go:117] "RemoveContainer" containerID="21f2c20e02291265dbd53b60bc1e391f02382442da3143ab1abc187e58b03aed" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.313928 4708 scope.go:117] "RemoveContainer" containerID="795384e065eb592d7a2cb0388d3a77603fdafcbc22b617d0066faf8b965f6be4" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.351944 4708 scope.go:117] "RemoveContainer" containerID="f5d844b29fa2ec17ac8e9eb25fea41a9b99ee25895f99a4f9891026e38e9cf75" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.404258 4708 scope.go:117] "RemoveContainer" containerID="f0a2395cbb3be1be4591e8307948992da95749400177c63c6be783deffd7426a" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.423340 4708 scope.go:117] "RemoveContainer" containerID="213575d6a25a26b34c0003211c4961a6918a6086451952f45af358a92d64fc51" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.475911 4708 scope.go:117] "RemoveContainer" containerID="2c0452b3fef5af38801341cd45e074b156bd620faca4a5a65af79d8cdcac09e5" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.507358 4708 scope.go:117] "RemoveContainer" containerID="16a13aff1fa1eacbf4b6a8eb05485a0bd31a5f831900c3ef98965ab8c09788f5" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.533472 4708 scope.go:117] "RemoveContainer" containerID="0643e6442963881271a1a07da2269fb37ccd6145883356b6740dd9f138bd7230" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.553108 4708 scope.go:117] "RemoveContainer" containerID="8fbe54b40e5f5c97037f62b168a6ff3392879081a253eee9d30bac66ba93c773" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.577405 4708 scope.go:117] "RemoveContainer" containerID="229e8472d38db4b9e53d22a48ee4fd5606a52ed4c5b741cfc1a769a9e9dd728b" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.605794 4708 scope.go:117] "RemoveContainer" containerID="b5a50a5b9ae667b828ddcee663d732f69dd3e8ff3e48202185ac4b1a3015e7d0" Mar 20 16:31:18 crc 
kubenswrapper[4708]: I0320 16:31:18.650199 4708 scope.go:117] "RemoveContainer" containerID="317905a8e042638c9a6a84eb3de76e71f19b23e7e8391b29d27b4d5baa308467" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.670477 4708 scope.go:117] "RemoveContainer" containerID="82f81a2906e9aafa474771012c56f045dcf7464bb996d8872ecd4aff4e0da9ec" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.696212 4708 scope.go:117] "RemoveContainer" containerID="b9631cc0460a6cf01a43dcf7e777b24acea1d0e2a5016feaa71440eec26f3164" Mar 20 16:31:18 crc kubenswrapper[4708]: I0320 16:31:18.715167 4708 scope.go:117] "RemoveContainer" containerID="30da198c7b5099cb93e2396975abb01d064840c33daffa22b7fc8b0ec08090b0" Mar 20 16:31:24 crc kubenswrapper[4708]: I0320 16:31:24.111404 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:31:24 crc kubenswrapper[4708]: E0320 16:31:24.112775 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:31:35 crc kubenswrapper[4708]: I0320 16:31:35.111797 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:31:35 crc kubenswrapper[4708]: E0320 16:31:35.112527 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" 
podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.557369 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zrzrd"] Mar 20 16:31:46 crc kubenswrapper[4708]: E0320 16:31:46.558427 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe8ded9-92d5-473f-a111-9c4fea2091ba" containerName="gather" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.558446 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe8ded9-92d5-473f-a111-9c4fea2091ba" containerName="gather" Mar 20 16:31:46 crc kubenswrapper[4708]: E0320 16:31:46.558463 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe8ded9-92d5-473f-a111-9c4fea2091ba" containerName="copy" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.558470 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe8ded9-92d5-473f-a111-9c4fea2091ba" containerName="copy" Mar 20 16:31:46 crc kubenswrapper[4708]: E0320 16:31:46.558501 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8addf4a2-e310-4f92-9082-048c3cafca93" containerName="oc" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.558511 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="8addf4a2-e310-4f92-9082-048c3cafca93" containerName="oc" Mar 20 16:31:46 crc kubenswrapper[4708]: E0320 16:31:46.558528 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8e1026-b0fd-477d-94f0-9d577115e2ba" containerName="collect-profiles" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.558538 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8e1026-b0fd-477d-94f0-9d577115e2ba" containerName="collect-profiles" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.558793 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe8ded9-92d5-473f-a111-9c4fea2091ba" containerName="gather" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.558833 
4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="8addf4a2-e310-4f92-9082-048c3cafca93" containerName="oc" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.558842 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8e1026-b0fd-477d-94f0-9d577115e2ba" containerName="collect-profiles" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.558852 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe8ded9-92d5-473f-a111-9c4fea2091ba" containerName="copy" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.560505 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.569370 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrzrd"] Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.636663 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cjxx\" (UniqueName: \"kubernetes.io/projected/127f3d9f-81ae-48d4-ac16-0dba3912296e-kube-api-access-7cjxx\") pod \"community-operators-zrzrd\" (UID: \"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.637066 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-catalog-content\") pod \"community-operators-zrzrd\" (UID: \"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.637198 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-utilities\") 
pod \"community-operators-zrzrd\" (UID: \"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.739078 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-catalog-content\") pod \"community-operators-zrzrd\" (UID: \"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.739151 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-utilities\") pod \"community-operators-zrzrd\" (UID: \"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.739239 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cjxx\" (UniqueName: \"kubernetes.io/projected/127f3d9f-81ae-48d4-ac16-0dba3912296e-kube-api-access-7cjxx\") pod \"community-operators-zrzrd\" (UID: \"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.739813 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-catalog-content\") pod \"community-operators-zrzrd\" (UID: \"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.739862 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-utilities\") pod \"community-operators-zrzrd\" (UID: 
\"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.766291 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cjxx\" (UniqueName: \"kubernetes.io/projected/127f3d9f-81ae-48d4-ac16-0dba3912296e-kube-api-access-7cjxx\") pod \"community-operators-zrzrd\" (UID: \"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:46 crc kubenswrapper[4708]: I0320 16:31:46.928878 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:47 crc kubenswrapper[4708]: I0320 16:31:47.050437 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-rpnp8"] Mar 20 16:31:47 crc kubenswrapper[4708]: I0320 16:31:47.068357 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-rpnp8"] Mar 20 16:31:47 crc kubenswrapper[4708]: I0320 16:31:47.440768 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrzrd"] Mar 20 16:31:48 crc kubenswrapper[4708]: I0320 16:31:48.090864 4708 generic.go:334] "Generic (PLEG): container finished" podID="127f3d9f-81ae-48d4-ac16-0dba3912296e" containerID="298e615ea740e2902beb9c0ff5641b12ef3d731566ced661ea1ec74e851b51fe" exitCode=0 Mar 20 16:31:48 crc kubenswrapper[4708]: I0320 16:31:48.090967 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrzrd" event={"ID":"127f3d9f-81ae-48d4-ac16-0dba3912296e","Type":"ContainerDied","Data":"298e615ea740e2902beb9c0ff5641b12ef3d731566ced661ea1ec74e851b51fe"} Mar 20 16:31:48 crc kubenswrapper[4708]: I0320 16:31:48.091190 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrzrd" 
event={"ID":"127f3d9f-81ae-48d4-ac16-0dba3912296e","Type":"ContainerStarted","Data":"be5bc6b88f32e59e880ab206e9634bc7185df0c6b8e3981d7e38d943dc76183f"} Mar 20 16:31:48 crc kubenswrapper[4708]: I0320 16:31:48.092787 4708 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:31:48 crc kubenswrapper[4708]: I0320 16:31:48.121126 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e73e6a53-ccd4-45bf-ad96-a6de1e696888" path="/var/lib/kubelet/pods/e73e6a53-ccd4-45bf-ad96-a6de1e696888/volumes" Mar 20 16:31:49 crc kubenswrapper[4708]: I0320 16:31:49.110699 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:31:49 crc kubenswrapper[4708]: E0320 16:31:49.111273 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.110710 4708 generic.go:334] "Generic (PLEG): container finished" podID="127f3d9f-81ae-48d4-ac16-0dba3912296e" containerID="c89f49b3c4db1a5b5ba4f10f99449ca0276a04f7b448f83c7de930185cf3a857" exitCode=0 Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.138874 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrzrd" event={"ID":"127f3d9f-81ae-48d4-ac16-0dba3912296e","Type":"ContainerDied","Data":"c89f49b3c4db1a5b5ba4f10f99449ca0276a04f7b448f83c7de930185cf3a857"} Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.165414 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nhv8w"] Mar 20 16:31:50 
crc kubenswrapper[4708]: I0320 16:31:50.167561 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.177656 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhv8w"] Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.314944 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-utilities\") pod \"redhat-operators-nhv8w\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") " pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.315002 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-catalog-content\") pod \"redhat-operators-nhv8w\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") " pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.315271 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrstp\" (UniqueName: \"kubernetes.io/projected/b3f98a64-a6b6-4d85-be38-77d4327c8782-kube-api-access-hrstp\") pod \"redhat-operators-nhv8w\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") " pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.416779 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-utilities\") pod \"redhat-operators-nhv8w\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") " pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 
16:31:50.416830 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-catalog-content\") pod \"redhat-operators-nhv8w\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") " pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.416981 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrstp\" (UniqueName: \"kubernetes.io/projected/b3f98a64-a6b6-4d85-be38-77d4327c8782-kube-api-access-hrstp\") pod \"redhat-operators-nhv8w\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") " pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.417880 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-utilities\") pod \"redhat-operators-nhv8w\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") " pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.418102 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-catalog-content\") pod \"redhat-operators-nhv8w\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") " pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.438972 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrstp\" (UniqueName: \"kubernetes.io/projected/b3f98a64-a6b6-4d85-be38-77d4327c8782-kube-api-access-hrstp\") pod \"redhat-operators-nhv8w\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") " pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.489647 4708 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:31:50 crc kubenswrapper[4708]: I0320 16:31:50.955606 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhv8w"] Mar 20 16:31:50 crc kubenswrapper[4708]: W0320 16:31:50.956848 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3f98a64_a6b6_4d85_be38_77d4327c8782.slice/crio-349ca748629f87f013e18b81ed59db4f4ee2bbdb9776c5541e724f486a18af5d WatchSource:0}: Error finding container 349ca748629f87f013e18b81ed59db4f4ee2bbdb9776c5541e724f486a18af5d: Status 404 returned error can't find the container with id 349ca748629f87f013e18b81ed59db4f4ee2bbdb9776c5541e724f486a18af5d Mar 20 16:31:51 crc kubenswrapper[4708]: I0320 16:31:51.122699 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhv8w" event={"ID":"b3f98a64-a6b6-4d85-be38-77d4327c8782","Type":"ContainerStarted","Data":"27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165"} Mar 20 16:31:51 crc kubenswrapper[4708]: I0320 16:31:51.122746 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhv8w" event={"ID":"b3f98a64-a6b6-4d85-be38-77d4327c8782","Type":"ContainerStarted","Data":"349ca748629f87f013e18b81ed59db4f4ee2bbdb9776c5541e724f486a18af5d"} Mar 20 16:31:51 crc kubenswrapper[4708]: I0320 16:31:51.128279 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrzrd" event={"ID":"127f3d9f-81ae-48d4-ac16-0dba3912296e","Type":"ContainerStarted","Data":"7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b"} Mar 20 16:31:51 crc kubenswrapper[4708]: I0320 16:31:51.156197 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zrzrd" podStartSLOduration=2.397360212 
podStartE2EDuration="5.156170588s" podCreationTimestamp="2026-03-20 16:31:46 +0000 UTC" firstStartedPulling="2026-03-20 16:31:48.092508718 +0000 UTC m=+1862.766845433" lastFinishedPulling="2026-03-20 16:31:50.851319094 +0000 UTC m=+1865.525655809" observedRunningTime="2026-03-20 16:31:51.144962828 +0000 UTC m=+1865.819299543" watchObservedRunningTime="2026-03-20 16:31:51.156170588 +0000 UTC m=+1865.830507293" Mar 20 16:31:52 crc kubenswrapper[4708]: I0320 16:31:52.142808 4708 generic.go:334] "Generic (PLEG): container finished" podID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerID="27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165" exitCode=0 Mar 20 16:31:52 crc kubenswrapper[4708]: I0320 16:31:52.142856 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhv8w" event={"ID":"b3f98a64-a6b6-4d85-be38-77d4327c8782","Type":"ContainerDied","Data":"27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165"} Mar 20 16:31:53 crc kubenswrapper[4708]: I0320 16:31:53.151968 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhv8w" event={"ID":"b3f98a64-a6b6-4d85-be38-77d4327c8782","Type":"ContainerStarted","Data":"6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926"} Mar 20 16:31:54 crc kubenswrapper[4708]: I0320 16:31:54.041568 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7q7vc"] Mar 20 16:31:54 crc kubenswrapper[4708]: I0320 16:31:54.049298 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7q7vc"] Mar 20 16:31:54 crc kubenswrapper[4708]: I0320 16:31:54.121602 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ded2837-c536-490b-a13c-2a09ea07a7aa" path="/var/lib/kubelet/pods/3ded2837-c536-490b-a13c-2a09ea07a7aa/volumes" Mar 20 16:31:54 crc kubenswrapper[4708]: I0320 16:31:54.162994 4708 generic.go:334] "Generic (PLEG): container finished" 
podID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerID="6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926" exitCode=0 Mar 20 16:31:54 crc kubenswrapper[4708]: I0320 16:31:54.163062 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhv8w" event={"ID":"b3f98a64-a6b6-4d85-be38-77d4327c8782","Type":"ContainerDied","Data":"6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926"} Mar 20 16:31:55 crc kubenswrapper[4708]: I0320 16:31:55.175610 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhv8w" event={"ID":"b3f98a64-a6b6-4d85-be38-77d4327c8782","Type":"ContainerStarted","Data":"fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f"} Mar 20 16:31:55 crc kubenswrapper[4708]: I0320 16:31:55.198135 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nhv8w" podStartSLOduration=2.546548374 podStartE2EDuration="5.198114018s" podCreationTimestamp="2026-03-20 16:31:50 +0000 UTC" firstStartedPulling="2026-03-20 16:31:52.144764063 +0000 UTC m=+1866.819100778" lastFinishedPulling="2026-03-20 16:31:54.796329707 +0000 UTC m=+1869.470666422" observedRunningTime="2026-03-20 16:31:55.195249389 +0000 UTC m=+1869.869586104" watchObservedRunningTime="2026-03-20 16:31:55.198114018 +0000 UTC m=+1869.872450743" Mar 20 16:31:56 crc kubenswrapper[4708]: I0320 16:31:56.929359 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:56 crc kubenswrapper[4708]: I0320 16:31:56.929410 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:56 crc kubenswrapper[4708]: I0320 16:31:56.975573 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:57 crc 
kubenswrapper[4708]: I0320 16:31:57.039271 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-q5htg"] Mar 20 16:31:57 crc kubenswrapper[4708]: I0320 16:31:57.051183 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-sr6vd"] Mar 20 16:31:57 crc kubenswrapper[4708]: I0320 16:31:57.061483 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-q5htg"] Mar 20 16:31:57 crc kubenswrapper[4708]: I0320 16:31:57.069190 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-sr6vd"] Mar 20 16:31:57 crc kubenswrapper[4708]: I0320 16:31:57.236365 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:31:57 crc kubenswrapper[4708]: I0320 16:31:57.753476 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrzrd"] Mar 20 16:31:58 crc kubenswrapper[4708]: I0320 16:31:58.120333 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="341b59c3-684f-45e4-9d42-ed258e0e671b" path="/var/lib/kubelet/pods/341b59c3-684f-45e4-9d42-ed258e0e671b/volumes" Mar 20 16:31:58 crc kubenswrapper[4708]: I0320 16:31:58.120906 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e4d34b-0c95-475c-b9e5-be1dff27d5a3" path="/var/lib/kubelet/pods/52e4d34b-0c95-475c-b9e5-be1dff27d5a3/volumes" Mar 20 16:31:59 crc kubenswrapper[4708]: I0320 16:31:59.204882 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zrzrd" podUID="127f3d9f-81ae-48d4-ac16-0dba3912296e" containerName="registry-server" containerID="cri-o://7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b" gracePeriod=2 Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.111009 4708 scope.go:117] "RemoveContainer" 
containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445" Mar 20 16:32:00 crc kubenswrapper[4708]: E0320 16:32:00.111245 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.162469 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567072-kp2h6"] Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.165270 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-kp2h6" Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.168916 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.169105 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5" Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.169208 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.176557 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-kp2h6"] Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.303089 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shvhr\" (UniqueName: \"kubernetes.io/projected/efb76c0f-8f46-4c9f-b25b-62cc462c9f43-kube-api-access-shvhr\") pod \"auto-csr-approver-29567072-kp2h6\" (UID: 
\"efb76c0f-8f46-4c9f-b25b-62cc462c9f43\") " pod="openshift-infra/auto-csr-approver-29567072-kp2h6" Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.404865 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shvhr\" (UniqueName: \"kubernetes.io/projected/efb76c0f-8f46-4c9f-b25b-62cc462c9f43-kube-api-access-shvhr\") pod \"auto-csr-approver-29567072-kp2h6\" (UID: \"efb76c0f-8f46-4c9f-b25b-62cc462c9f43\") " pod="openshift-infra/auto-csr-approver-29567072-kp2h6" Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.424693 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shvhr\" (UniqueName: \"kubernetes.io/projected/efb76c0f-8f46-4c9f-b25b-62cc462c9f43-kube-api-access-shvhr\") pod \"auto-csr-approver-29567072-kp2h6\" (UID: \"efb76c0f-8f46-4c9f-b25b-62cc462c9f43\") " pod="openshift-infra/auto-csr-approver-29567072-kp2h6" Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.501601 4708 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-kp2h6" Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.503288 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.506123 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:32:00 crc kubenswrapper[4708]: I0320 16:32:00.984564 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-kp2h6"] Mar 20 16:32:00 crc kubenswrapper[4708]: W0320 16:32:00.989320 4708 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb76c0f_8f46_4c9f_b25b_62cc462c9f43.slice/crio-8480948f1e3bae12c86b29b77dea9462d59e9917aece05241d11c6f416afb9e3 WatchSource:0}: Error finding container 8480948f1e3bae12c86b29b77dea9462d59e9917aece05241d11c6f416afb9e3: Status 404 returned error can't find the container with id 8480948f1e3bae12c86b29b77dea9462d59e9917aece05241d11c6f416afb9e3 Mar 20 16:32:01 crc kubenswrapper[4708]: I0320 16:32:01.222782 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-kp2h6" event={"ID":"efb76c0f-8f46-4c9f-b25b-62cc462c9f43","Type":"ContainerStarted","Data":"8480948f1e3bae12c86b29b77dea9462d59e9917aece05241d11c6f416afb9e3"} Mar 20 16:32:01 crc kubenswrapper[4708]: I0320 16:32:01.560535 4708 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nhv8w" podUID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerName="registry-server" probeResult="failure" output=< Mar 20 16:32:01 crc kubenswrapper[4708]: timeout: failed to connect service ":50051" within 1s Mar 20 16:32:01 crc kubenswrapper[4708]: > Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.039362 4708 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.233187 4708 generic.go:334] "Generic (PLEG): container finished" podID="127f3d9f-81ae-48d4-ac16-0dba3912296e" containerID="7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b" exitCode=0 Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.233285 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrzrd" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.233274 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrzrd" event={"ID":"127f3d9f-81ae-48d4-ac16-0dba3912296e","Type":"ContainerDied","Data":"7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b"} Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.233643 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrzrd" event={"ID":"127f3d9f-81ae-48d4-ac16-0dba3912296e","Type":"ContainerDied","Data":"be5bc6b88f32e59e880ab206e9634bc7185df0c6b8e3981d7e38d943dc76183f"} Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.233661 4708 scope.go:117] "RemoveContainer" containerID="7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.241680 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-utilities\") pod \"127f3d9f-81ae-48d4-ac16-0dba3912296e\" (UID: \"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.241921 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-catalog-content\") pod 
\"127f3d9f-81ae-48d4-ac16-0dba3912296e\" (UID: \"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.242072 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cjxx\" (UniqueName: \"kubernetes.io/projected/127f3d9f-81ae-48d4-ac16-0dba3912296e-kube-api-access-7cjxx\") pod \"127f3d9f-81ae-48d4-ac16-0dba3912296e\" (UID: \"127f3d9f-81ae-48d4-ac16-0dba3912296e\") " Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.242816 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-utilities" (OuterVolumeSpecName: "utilities") pod "127f3d9f-81ae-48d4-ac16-0dba3912296e" (UID: "127f3d9f-81ae-48d4-ac16-0dba3912296e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.245628 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.247157 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127f3d9f-81ae-48d4-ac16-0dba3912296e-kube-api-access-7cjxx" (OuterVolumeSpecName: "kube-api-access-7cjxx") pod "127f3d9f-81ae-48d4-ac16-0dba3912296e" (UID: "127f3d9f-81ae-48d4-ac16-0dba3912296e"). InnerVolumeSpecName "kube-api-access-7cjxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.255893 4708 scope.go:117] "RemoveContainer" containerID="c89f49b3c4db1a5b5ba4f10f99449ca0276a04f7b448f83c7de930185cf3a857" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.293516 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "127f3d9f-81ae-48d4-ac16-0dba3912296e" (UID: "127f3d9f-81ae-48d4-ac16-0dba3912296e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.347223 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127f3d9f-81ae-48d4-ac16-0dba3912296e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.347269 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cjxx\" (UniqueName: \"kubernetes.io/projected/127f3d9f-81ae-48d4-ac16-0dba3912296e-kube-api-access-7cjxx\") on node \"crc\" DevicePath \"\"" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.363954 4708 scope.go:117] "RemoveContainer" containerID="298e615ea740e2902beb9c0ff5641b12ef3d731566ced661ea1ec74e851b51fe" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.406317 4708 scope.go:117] "RemoveContainer" containerID="7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b" Mar 20 16:32:02 crc kubenswrapper[4708]: E0320 16:32:02.406837 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b\": container with ID starting with 7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b not found: ID does not exist" 
containerID="7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.406884 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b"} err="failed to get container status \"7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b\": rpc error: code = NotFound desc = could not find container \"7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b\": container with ID starting with 7cfdf13cf3639400aed0ce11949d16ca9a497d6e640508dcbbd2721c9abad12b not found: ID does not exist" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.406911 4708 scope.go:117] "RemoveContainer" containerID="c89f49b3c4db1a5b5ba4f10f99449ca0276a04f7b448f83c7de930185cf3a857" Mar 20 16:32:02 crc kubenswrapper[4708]: E0320 16:32:02.407164 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89f49b3c4db1a5b5ba4f10f99449ca0276a04f7b448f83c7de930185cf3a857\": container with ID starting with c89f49b3c4db1a5b5ba4f10f99449ca0276a04f7b448f83c7de930185cf3a857 not found: ID does not exist" containerID="c89f49b3c4db1a5b5ba4f10f99449ca0276a04f7b448f83c7de930185cf3a857" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.407192 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89f49b3c4db1a5b5ba4f10f99449ca0276a04f7b448f83c7de930185cf3a857"} err="failed to get container status \"c89f49b3c4db1a5b5ba4f10f99449ca0276a04f7b448f83c7de930185cf3a857\": rpc error: code = NotFound desc = could not find container \"c89f49b3c4db1a5b5ba4f10f99449ca0276a04f7b448f83c7de930185cf3a857\": container with ID starting with c89f49b3c4db1a5b5ba4f10f99449ca0276a04f7b448f83c7de930185cf3a857 not found: ID does not exist" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.407207 4708 scope.go:117] 
"RemoveContainer" containerID="298e615ea740e2902beb9c0ff5641b12ef3d731566ced661ea1ec74e851b51fe" Mar 20 16:32:02 crc kubenswrapper[4708]: E0320 16:32:02.407453 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"298e615ea740e2902beb9c0ff5641b12ef3d731566ced661ea1ec74e851b51fe\": container with ID starting with 298e615ea740e2902beb9c0ff5641b12ef3d731566ced661ea1ec74e851b51fe not found: ID does not exist" containerID="298e615ea740e2902beb9c0ff5641b12ef3d731566ced661ea1ec74e851b51fe" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.407478 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298e615ea740e2902beb9c0ff5641b12ef3d731566ced661ea1ec74e851b51fe"} err="failed to get container status \"298e615ea740e2902beb9c0ff5641b12ef3d731566ced661ea1ec74e851b51fe\": rpc error: code = NotFound desc = could not find container \"298e615ea740e2902beb9c0ff5641b12ef3d731566ced661ea1ec74e851b51fe\": container with ID starting with 298e615ea740e2902beb9c0ff5641b12ef3d731566ced661ea1ec74e851b51fe not found: ID does not exist" Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.570183 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrzrd"] Mar 20 16:32:02 crc kubenswrapper[4708]: I0320 16:32:02.577351 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zrzrd"] Mar 20 16:32:03 crc kubenswrapper[4708]: I0320 16:32:03.244230 4708 generic.go:334] "Generic (PLEG): container finished" podID="efb76c0f-8f46-4c9f-b25b-62cc462c9f43" containerID="82bf83e3e94460bda9fda41e8b60836e4842367758da2671825309468faf2e8c" exitCode=0 Mar 20 16:32:03 crc kubenswrapper[4708]: I0320 16:32:03.244589 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-kp2h6" 
event={"ID":"efb76c0f-8f46-4c9f-b25b-62cc462c9f43","Type":"ContainerDied","Data":"82bf83e3e94460bda9fda41e8b60836e4842367758da2671825309468faf2e8c"} Mar 20 16:32:04 crc kubenswrapper[4708]: I0320 16:32:04.126662 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="127f3d9f-81ae-48d4-ac16-0dba3912296e" path="/var/lib/kubelet/pods/127f3d9f-81ae-48d4-ac16-0dba3912296e/volumes" Mar 20 16:32:04 crc kubenswrapper[4708]: I0320 16:32:04.601965 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-kp2h6" Mar 20 16:32:04 crc kubenswrapper[4708]: I0320 16:32:04.690555 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shvhr\" (UniqueName: \"kubernetes.io/projected/efb76c0f-8f46-4c9f-b25b-62cc462c9f43-kube-api-access-shvhr\") pod \"efb76c0f-8f46-4c9f-b25b-62cc462c9f43\" (UID: \"efb76c0f-8f46-4c9f-b25b-62cc462c9f43\") " Mar 20 16:32:04 crc kubenswrapper[4708]: I0320 16:32:04.695968 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb76c0f-8f46-4c9f-b25b-62cc462c9f43-kube-api-access-shvhr" (OuterVolumeSpecName: "kube-api-access-shvhr") pod "efb76c0f-8f46-4c9f-b25b-62cc462c9f43" (UID: "efb76c0f-8f46-4c9f-b25b-62cc462c9f43"). InnerVolumeSpecName "kube-api-access-shvhr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:32:04 crc kubenswrapper[4708]: I0320 16:32:04.792956 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shvhr\" (UniqueName: \"kubernetes.io/projected/efb76c0f-8f46-4c9f-b25b-62cc462c9f43-kube-api-access-shvhr\") on node \"crc\" DevicePath \"\"" Mar 20 16:32:05 crc kubenswrapper[4708]: I0320 16:32:05.283385 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-kp2h6" event={"ID":"efb76c0f-8f46-4c9f-b25b-62cc462c9f43","Type":"ContainerDied","Data":"8480948f1e3bae12c86b29b77dea9462d59e9917aece05241d11c6f416afb9e3"} Mar 20 16:32:05 crc kubenswrapper[4708]: I0320 16:32:05.284056 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8480948f1e3bae12c86b29b77dea9462d59e9917aece05241d11c6f416afb9e3" Mar 20 16:32:05 crc kubenswrapper[4708]: I0320 16:32:05.283525 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-kp2h6" Mar 20 16:32:05 crc kubenswrapper[4708]: I0320 16:32:05.683931 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-thfp9"] Mar 20 16:32:05 crc kubenswrapper[4708]: I0320 16:32:05.692311 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-thfp9"] Mar 20 16:32:06 crc kubenswrapper[4708]: I0320 16:32:06.124136 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3749e625-ee58-48e8-858d-2b95c1e33b05" path="/var/lib/kubelet/pods/3749e625-ee58-48e8-858d-2b95c1e33b05/volumes" Mar 20 16:32:10 crc kubenswrapper[4708]: I0320 16:32:10.034741 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-zk9cx"] Mar 20 16:32:10 crc kubenswrapper[4708]: I0320 16:32:10.044970 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-zk9cx"] Mar 20 
16:32:10 crc kubenswrapper[4708]: I0320 16:32:10.128225 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="204070bf-f103-49d9-b366-185454e68b9e" path="/var/lib/kubelet/pods/204070bf-f103-49d9-b366-185454e68b9e/volumes" Mar 20 16:32:10 crc kubenswrapper[4708]: I0320 16:32:10.537476 4708 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:32:10 crc kubenswrapper[4708]: I0320 16:32:10.584035 4708 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nhv8w" Mar 20 16:32:10 crc kubenswrapper[4708]: I0320 16:32:10.776372 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nhv8w"] Mar 20 16:32:11 crc kubenswrapper[4708]: I0320 16:32:11.033765 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fgjlj"] Mar 20 16:32:11 crc kubenswrapper[4708]: I0320 16:32:11.044385 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fgjlj"] Mar 20 16:32:12 crc kubenswrapper[4708]: I0320 16:32:12.126043 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c46a759f-98ab-495d-9cab-ba1f2fbbb112" path="/var/lib/kubelet/pods/c46a759f-98ab-495d-9cab-ba1f2fbbb112/volumes" Mar 20 16:32:12 crc kubenswrapper[4708]: I0320 16:32:12.343387 4708 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nhv8w" podUID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerName="registry-server" containerID="cri-o://fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f" gracePeriod=2 Mar 20 16:32:12 crc kubenswrapper[4708]: I0320 16:32:12.873236 4708 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nhv8w"
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.050598 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-catalog-content\") pod \"b3f98a64-a6b6-4d85-be38-77d4327c8782\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") "
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.050657 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-utilities\") pod \"b3f98a64-a6b6-4d85-be38-77d4327c8782\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") "
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.050952 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrstp\" (UniqueName: \"kubernetes.io/projected/b3f98a64-a6b6-4d85-be38-77d4327c8782-kube-api-access-hrstp\") pod \"b3f98a64-a6b6-4d85-be38-77d4327c8782\" (UID: \"b3f98a64-a6b6-4d85-be38-77d4327c8782\") "
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.051775 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-utilities" (OuterVolumeSpecName: "utilities") pod "b3f98a64-a6b6-4d85-be38-77d4327c8782" (UID: "b3f98a64-a6b6-4d85-be38-77d4327c8782"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.056338 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3f98a64-a6b6-4d85-be38-77d4327c8782-kube-api-access-hrstp" (OuterVolumeSpecName: "kube-api-access-hrstp") pod "b3f98a64-a6b6-4d85-be38-77d4327c8782" (UID: "b3f98a64-a6b6-4d85-be38-77d4327c8782"). InnerVolumeSpecName "kube-api-access-hrstp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.111491 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445"
Mar 20 16:32:13 crc kubenswrapper[4708]: E0320 16:32:13.111973 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6"
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.153144 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrstp\" (UniqueName: \"kubernetes.io/projected/b3f98a64-a6b6-4d85-be38-77d4327c8782-kube-api-access-hrstp\") on node \"crc\" DevicePath \"\""
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.153176 4708 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.203545 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3f98a64-a6b6-4d85-be38-77d4327c8782" (UID: "b3f98a64-a6b6-4d85-be38-77d4327c8782"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.255018 4708 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f98a64-a6b6-4d85-be38-77d4327c8782-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.357819 4708 generic.go:334] "Generic (PLEG): container finished" podID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerID="fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f" exitCode=0
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.357860 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhv8w" event={"ID":"b3f98a64-a6b6-4d85-be38-77d4327c8782","Type":"ContainerDied","Data":"fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f"}
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.357897 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhv8w"
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.357913 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhv8w" event={"ID":"b3f98a64-a6b6-4d85-be38-77d4327c8782","Type":"ContainerDied","Data":"349ca748629f87f013e18b81ed59db4f4ee2bbdb9776c5541e724f486a18af5d"}
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.357931 4708 scope.go:117] "RemoveContainer" containerID="fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f"
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.390406 4708 scope.go:117] "RemoveContainer" containerID="6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926"
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.396860 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nhv8w"]
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.409828 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nhv8w"]
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.436619 4708 scope.go:117] "RemoveContainer" containerID="27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165"
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.483073 4708 scope.go:117] "RemoveContainer" containerID="fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f"
Mar 20 16:32:13 crc kubenswrapper[4708]: E0320 16:32:13.483593 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f\": container with ID starting with fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f not found: ID does not exist" containerID="fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f"
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.483658 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f"} err="failed to get container status \"fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f\": rpc error: code = NotFound desc = could not find container \"fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f\": container with ID starting with fd52f8b92737b399a3b4854d404cd93e921bee825cb280a0bc0bce582a7d592f not found: ID does not exist"
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.483721 4708 scope.go:117] "RemoveContainer" containerID="6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926"
Mar 20 16:32:13 crc kubenswrapper[4708]: E0320 16:32:13.484362 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926\": container with ID starting with 6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926 not found: ID does not exist" containerID="6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926"
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.484414 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926"} err="failed to get container status \"6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926\": rpc error: code = NotFound desc = could not find container \"6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926\": container with ID starting with 6a3ee58015bab92dbbb4c20ecf7783b854cd301fb2582f7da0dea835e10be926 not found: ID does not exist"
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.484449 4708 scope.go:117] "RemoveContainer" containerID="27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165"
Mar 20 16:32:13 crc kubenswrapper[4708]: E0320 16:32:13.484778 4708 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165\": container with ID starting with 27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165 not found: ID does not exist" containerID="27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165"
Mar 20 16:32:13 crc kubenswrapper[4708]: I0320 16:32:13.484817 4708 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165"} err="failed to get container status \"27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165\": rpc error: code = NotFound desc = could not find container \"27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165\": container with ID starting with 27c08a91d838cbafc43b32f424711b7a3e8b0db312cf98993a30153580218165 not found: ID does not exist"
Mar 20 16:32:14 crc kubenswrapper[4708]: I0320 16:32:14.125588 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3f98a64-a6b6-4d85-be38-77d4327c8782" path="/var/lib/kubelet/pods/b3f98a64-a6b6-4d85-be38-77d4327c8782/volumes"
Mar 20 16:32:19 crc kubenswrapper[4708]: I0320 16:32:19.040341 4708 scope.go:117] "RemoveContainer" containerID="89897c70acf5379db0001e6266077a0415262c310b34144e5e0a9c90860b2dd5"
Mar 20 16:32:19 crc kubenswrapper[4708]: I0320 16:32:19.070648 4708 scope.go:117] "RemoveContainer" containerID="0214912007e228e729cb351ec91fad5c2f6ba03158fcc635b3bae094ab2c476a"
Mar 20 16:32:19 crc kubenswrapper[4708]: I0320 16:32:19.131101 4708 scope.go:117] "RemoveContainer" containerID="b59ea8e7c252be39ac0d296ea25722705ba81ac9da0d2242f50c7bd36f05d703"
Mar 20 16:32:19 crc kubenswrapper[4708]: I0320 16:32:19.176442 4708 scope.go:117] "RemoveContainer" containerID="c26bc67a464e71977a18197777127053a8d65ea3985cde8cb4f1829105e1bc4e"
Mar 20 16:32:19 crc kubenswrapper[4708]: I0320 16:32:19.224318 4708 scope.go:117] "RemoveContainer" containerID="b4456b51d786af4b0433c292a77b61ade2e20ded4c15e1d30615eb8b6fcee510"
Mar 20 16:32:19 crc kubenswrapper[4708]: I0320 16:32:19.273827 4708 scope.go:117] "RemoveContainer" containerID="612769ba32a5de6e87931c0a149f996bebf73631f256e8394b11aa927a7f24f1"
Mar 20 16:32:19 crc kubenswrapper[4708]: I0320 16:32:19.331329 4708 scope.go:117] "RemoveContainer" containerID="26b6e59e594f71b347641897b67adbb4240aece918c6018f1e81ff0c16b097cd"
Mar 20 16:32:25 crc kubenswrapper[4708]: I0320 16:32:25.112304 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445"
Mar 20 16:32:25 crc kubenswrapper[4708]: E0320 16:32:25.114421 4708 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgbv9_openshift-machine-config-operator(fbd987d1-f981-4e7a-b063-920f84a0d7f6)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6"
Mar 20 16:32:38 crc kubenswrapper[4708]: I0320 16:32:38.111597 4708 scope.go:117] "RemoveContainer" containerID="722c1e5994a45e916a466fee6dc6200430e7e4c60f4db672361b70c0bf634445"
Mar 20 16:32:38 crc kubenswrapper[4708]: I0320 16:32:38.594764 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" event={"ID":"fbd987d1-f981-4e7a-b063-920f84a0d7f6","Type":"ContainerStarted","Data":"f26ffc3fdf9a5da62820cf14b12e60842fbdca1bc9853bbdc8aeb1f9f6a7f346"}
Mar 20 16:32:55 crc kubenswrapper[4708]: I0320 16:32:55.054198 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1d98-account-create-update-vvbxv"]
Mar 20 16:32:55 crc kubenswrapper[4708]: I0320 16:32:55.065231 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1d98-account-create-update-vvbxv"]
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.034059 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-2vzqh"]
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.042214 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-qc5ff"]
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.050054 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-2vzqh"]
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.057889 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-qc5ff"]
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.066999 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0234-account-create-update-krdbf"]
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.074828 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-f9fb-account-create-update-ngw6v"]
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.083199 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0234-account-create-update-krdbf"]
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.091718 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wwl4c"]
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.099587 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-f9fb-account-create-update-ngw6v"]
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.106144 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wwl4c"]
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.123256 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc" path="/var/lib/kubelet/pods/22dccba0-f2ba-4c32-8d42-a6fa8dd9fefc/volumes"
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.124515 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3116dd2a-d2d0-46cf-837d-56d29a7e116f" path="/var/lib/kubelet/pods/3116dd2a-d2d0-46cf-837d-56d29a7e116f/volumes"
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.125418 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="394a21a5-81ce-4b43-8642-70a03a4a0685" path="/var/lib/kubelet/pods/394a21a5-81ce-4b43-8642-70a03a4a0685/volumes"
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.126171 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f246f8c-2e08-400c-af52-746be688f708" path="/var/lib/kubelet/pods/3f246f8c-2e08-400c-af52-746be688f708/volumes"
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.127249 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630fc775-bda7-45ac-9852-650855479072" path="/var/lib/kubelet/pods/630fc775-bda7-45ac-9852-650855479072/volumes"
Mar 20 16:32:56 crc kubenswrapper[4708]: I0320 16:32:56.127909 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b66f79-c7ee-40c3-a026-0c42a0648f11" path="/var/lib/kubelet/pods/b0b66f79-c7ee-40c3-a026-0c42a0648f11/volumes"
Mar 20 16:33:19 crc kubenswrapper[4708]: I0320 16:33:19.701619 4708 scope.go:117] "RemoveContainer" containerID="ef1d3bb93632a5560ad920acab64b1d767f12e941660c4b340e61063ebd5a674"
Mar 20 16:33:19 crc kubenswrapper[4708]: I0320 16:33:19.891657 4708 scope.go:117] "RemoveContainer" containerID="687b9c422e841e69520bc01f976381a4e90d6d634600469ca2f0b8c4d8dfdcf9"
Mar 20 16:33:19 crc kubenswrapper[4708]: I0320 16:33:19.937709 4708 scope.go:117] "RemoveContainer" containerID="a9e1f8effa424893dcfd6f68b1d52bfcda3c6d15bfe308378dbc4526a749d5c2"
Mar 20 16:33:19 crc kubenswrapper[4708]: I0320 16:33:19.970154 4708 scope.go:117] "RemoveContainer" containerID="16e1a51a2faac0c5273d327fd38ea9b0770a0d58a7a65ec9df7fb3a1ef1f562a"
Mar 20 16:33:20 crc kubenswrapper[4708]: I0320 16:33:20.009006 4708 scope.go:117] "RemoveContainer" containerID="4fdeea389bf333390f5cf82b8084e149563364b6d640b9454792c9f0e2458f82"
Mar 20 16:33:20 crc kubenswrapper[4708]: I0320 16:33:20.049325 4708 scope.go:117] "RemoveContainer" containerID="01fc1382778da1917738bf00c90b8ce2d1e7ef86718a5766543790d531d34d0a"
Mar 20 16:33:27 crc kubenswrapper[4708]: I0320 16:33:27.040137 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dkqq"]
Mar 20 16:33:27 crc kubenswrapper[4708]: I0320 16:33:27.048307 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dkqq"]
Mar 20 16:33:28 crc kubenswrapper[4708]: I0320 16:33:28.122236 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d679da4d-509e-49d9-a465-405bda8b3e2d" path="/var/lib/kubelet/pods/d679da4d-509e-49d9-a465-405bda8b3e2d/volumes"
Mar 20 16:33:49 crc kubenswrapper[4708]: I0320 16:33:49.034566 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-2hpvr"]
Mar 20 16:33:49 crc kubenswrapper[4708]: I0320 16:33:49.041729 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-2hpvr"]
Mar 20 16:33:50 crc kubenswrapper[4708]: I0320 16:33:50.125065 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af676e7f-5129-436c-9451-8a9b1c8c19c0" path="/var/lib/kubelet/pods/af676e7f-5129-436c-9451-8a9b1c8c19c0/volumes"
Mar 20 16:33:51 crc kubenswrapper[4708]: I0320 16:33:51.025140 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpnrd"]
Mar 20 16:33:51 crc kubenswrapper[4708]: I0320 16:33:51.032058 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpnrd"]
Mar 20 16:33:52 crc kubenswrapper[4708]: I0320 16:33:52.120523 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="384f206f-1142-4027-90ef-7adfeda8a5f5" path="/var/lib/kubelet/pods/384f206f-1142-4027-90ef-7adfeda8a5f5/volumes"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.154540 4708 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567074-wfp4g"]
Mar 20 16:34:00 crc kubenswrapper[4708]: E0320 16:34:00.155619 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerName="extract-content"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.155635 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerName="extract-content"
Mar 20 16:34:00 crc kubenswrapper[4708]: E0320 16:34:00.155656 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerName="extract-utilities"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.155682 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerName="extract-utilities"
Mar 20 16:34:00 crc kubenswrapper[4708]: E0320 16:34:00.155690 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerName="registry-server"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.155697 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerName="registry-server"
Mar 20 16:34:00 crc kubenswrapper[4708]: E0320 16:34:00.155716 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127f3d9f-81ae-48d4-ac16-0dba3912296e" containerName="extract-content"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.155722 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="127f3d9f-81ae-48d4-ac16-0dba3912296e" containerName="extract-content"
Mar 20 16:34:00 crc kubenswrapper[4708]: E0320 16:34:00.155733 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb76c0f-8f46-4c9f-b25b-62cc462c9f43" containerName="oc"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.155740 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb76c0f-8f46-4c9f-b25b-62cc462c9f43" containerName="oc"
Mar 20 16:34:00 crc kubenswrapper[4708]: E0320 16:34:00.155759 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127f3d9f-81ae-48d4-ac16-0dba3912296e" containerName="extract-utilities"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.155766 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="127f3d9f-81ae-48d4-ac16-0dba3912296e" containerName="extract-utilities"
Mar 20 16:34:00 crc kubenswrapper[4708]: E0320 16:34:00.155786 4708 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127f3d9f-81ae-48d4-ac16-0dba3912296e" containerName="registry-server"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.155794 4708 state_mem.go:107] "Deleted CPUSet assignment" podUID="127f3d9f-81ae-48d4-ac16-0dba3912296e" containerName="registry-server"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.155990 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="127f3d9f-81ae-48d4-ac16-0dba3912296e" containerName="registry-server"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.156010 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb76c0f-8f46-4c9f-b25b-62cc462c9f43" containerName="oc"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.156027 4708 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3f98a64-a6b6-4d85-be38-77d4327c8782" containerName="registry-server"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.156755 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-wfp4g"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.159393 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.160995 4708 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.161026 4708 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pwrt5"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.162168 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567074-wfp4g"]
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.315372 4708 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxsw8\" (UniqueName: \"kubernetes.io/projected/1fd79d41-4a38-4bf0-9b07-0f813b875248-kube-api-access-lxsw8\") pod \"auto-csr-approver-29567074-wfp4g\" (UID: \"1fd79d41-4a38-4bf0-9b07-0f813b875248\") " pod="openshift-infra/auto-csr-approver-29567074-wfp4g"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.417527 4708 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxsw8\" (UniqueName: \"kubernetes.io/projected/1fd79d41-4a38-4bf0-9b07-0f813b875248-kube-api-access-lxsw8\") pod \"auto-csr-approver-29567074-wfp4g\" (UID: \"1fd79d41-4a38-4bf0-9b07-0f813b875248\") " pod="openshift-infra/auto-csr-approver-29567074-wfp4g"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.439202 4708 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxsw8\" (UniqueName: \"kubernetes.io/projected/1fd79d41-4a38-4bf0-9b07-0f813b875248-kube-api-access-lxsw8\") pod \"auto-csr-approver-29567074-wfp4g\" (UID: \"1fd79d41-4a38-4bf0-9b07-0f813b875248\") " pod="openshift-infra/auto-csr-approver-29567074-wfp4g"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.478815 4708 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-wfp4g"
Mar 20 16:34:00 crc kubenswrapper[4708]: I0320 16:34:00.921447 4708 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567074-wfp4g"]
Mar 20 16:34:01 crc kubenswrapper[4708]: I0320 16:34:01.077002 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-wfp4g" event={"ID":"1fd79d41-4a38-4bf0-9b07-0f813b875248","Type":"ContainerStarted","Data":"c03b1f2e9510cef374d3857bfc768c295ddb124715011e17d1d0167ad3321c45"}
Mar 20 16:34:03 crc kubenswrapper[4708]: I0320 16:34:03.097862 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-wfp4g" event={"ID":"1fd79d41-4a38-4bf0-9b07-0f813b875248","Type":"ContainerStarted","Data":"6a42deabed52bf763815cac8bc8c185fd998f10b3efdb30c796267a8bc4630ef"}
Mar 20 16:34:03 crc kubenswrapper[4708]: I0320 16:34:03.114557 4708 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567074-wfp4g" podStartSLOduration=1.298450597 podStartE2EDuration="3.114540386s" podCreationTimestamp="2026-03-20 16:34:00 +0000 UTC" firstStartedPulling="2026-03-20 16:34:00.931180109 +0000 UTC m=+1995.605516824" lastFinishedPulling="2026-03-20 16:34:02.747269898 +0000 UTC m=+1997.421606613" observedRunningTime="2026-03-20 16:34:03.109953591 +0000 UTC m=+1997.784290306" watchObservedRunningTime="2026-03-20 16:34:03.114540386 +0000 UTC m=+1997.788877101"
Mar 20 16:34:04 crc kubenswrapper[4708]: I0320 16:34:04.110774 4708 generic.go:334] "Generic (PLEG): container finished" podID="1fd79d41-4a38-4bf0-9b07-0f813b875248" containerID="6a42deabed52bf763815cac8bc8c185fd998f10b3efdb30c796267a8bc4630ef" exitCode=0
Mar 20 16:34:04 crc kubenswrapper[4708]: I0320 16:34:04.126443 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-wfp4g" event={"ID":"1fd79d41-4a38-4bf0-9b07-0f813b875248","Type":"ContainerDied","Data":"6a42deabed52bf763815cac8bc8c185fd998f10b3efdb30c796267a8bc4630ef"}
Mar 20 16:34:05 crc kubenswrapper[4708]: I0320 16:34:05.469024 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-wfp4g"
Mar 20 16:34:05 crc kubenswrapper[4708]: I0320 16:34:05.623039 4708 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxsw8\" (UniqueName: \"kubernetes.io/projected/1fd79d41-4a38-4bf0-9b07-0f813b875248-kube-api-access-lxsw8\") pod \"1fd79d41-4a38-4bf0-9b07-0f813b875248\" (UID: \"1fd79d41-4a38-4bf0-9b07-0f813b875248\") "
Mar 20 16:34:05 crc kubenswrapper[4708]: I0320 16:34:05.632236 4708 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd79d41-4a38-4bf0-9b07-0f813b875248-kube-api-access-lxsw8" (OuterVolumeSpecName: "kube-api-access-lxsw8") pod "1fd79d41-4a38-4bf0-9b07-0f813b875248" (UID: "1fd79d41-4a38-4bf0-9b07-0f813b875248"). InnerVolumeSpecName "kube-api-access-lxsw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:34:05 crc kubenswrapper[4708]: I0320 16:34:05.726845 4708 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxsw8\" (UniqueName: \"kubernetes.io/projected/1fd79d41-4a38-4bf0-9b07-0f813b875248-kube-api-access-lxsw8\") on node \"crc\" DevicePath \"\""
Mar 20 16:34:06 crc kubenswrapper[4708]: I0320 16:34:06.129294 4708 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-wfp4g" event={"ID":"1fd79d41-4a38-4bf0-9b07-0f813b875248","Type":"ContainerDied","Data":"c03b1f2e9510cef374d3857bfc768c295ddb124715011e17d1d0167ad3321c45"}
Mar 20 16:34:06 crc kubenswrapper[4708]: I0320 16:34:06.129330 4708 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-wfp4g"
Mar 20 16:34:06 crc kubenswrapper[4708]: I0320 16:34:06.129341 4708 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c03b1f2e9510cef374d3857bfc768c295ddb124715011e17d1d0167ad3321c45"
Mar 20 16:34:06 crc kubenswrapper[4708]: I0320 16:34:06.196579 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-tv98j"]
Mar 20 16:34:06 crc kubenswrapper[4708]: I0320 16:34:06.204510 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-tv98j"]
Mar 20 16:34:08 crc kubenswrapper[4708]: I0320 16:34:08.125976 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b717bca9-34e7-41fb-b9ac-d65c674a3d22" path="/var/lib/kubelet/pods/b717bca9-34e7-41fb-b9ac-d65c674a3d22/volumes"
Mar 20 16:34:20 crc kubenswrapper[4708]: I0320 16:34:20.184840 4708 scope.go:117] "RemoveContainer" containerID="27965af84deaa4be5455820c418d957a265e1e56d40386b38b3c24d1d1e96aef"
Mar 20 16:34:20 crc kubenswrapper[4708]: I0320 16:34:20.226575 4708 scope.go:117] "RemoveContainer" containerID="b1ec4d5b3497d1f22acdc9688aa9dfb36a8195784ada83e5b4f1e1bea401c955"
Mar 20 16:34:20 crc kubenswrapper[4708]: I0320 16:34:20.279522 4708 scope.go:117] "RemoveContainer" containerID="83a49cf7103d63e64e2e66b7938372ce9a84e5e438e97eb4aaf4e992b32eab44"
Mar 20 16:34:20 crc kubenswrapper[4708]: I0320 16:34:20.331453 4708 scope.go:117] "RemoveContainer" containerID="b3655bcb7ad6f49cf31a88c893fcf8827f27a99cacd08b0af71387e156bbbf09"
Mar 20 16:34:35 crc kubenswrapper[4708]: I0320 16:34:35.047547 4708 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-nktb8"]
Mar 20 16:34:35 crc kubenswrapper[4708]: I0320 16:34:35.059536 4708 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-nktb8"]
Mar 20 16:34:36 crc kubenswrapper[4708]: I0320 16:34:36.120945 4708 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="243ed677-1e35-40c7-ba55-a90eb9c7b85c" path="/var/lib/kubelet/pods/243ed677-1e35-40c7-ba55-a90eb9c7b85c/volumes"
Mar 20 16:34:56 crc kubenswrapper[4708]: I0320 16:34:56.179025 4708 patch_prober.go:28] interesting pod/machine-config-daemon-sgbv9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:34:56 crc kubenswrapper[4708]: I0320 16:34:56.179707 4708 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgbv9" podUID="fbd987d1-f981-4e7a-b063-920f84a0d7f6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"